"ROSE Handbook"
Introduction, guidelines and underlying ideas

(Updated December 27th 2002)


ROSE[1] (The Relevance of Science Education) is an international comparative project based at the University of Oslo, Norway.

This document is meant as a 'handbook' for researchers who want to participate in the ROSE study and should be read in connection with the ROSE questionnaire. It describes the practical details of taking part in the study ('logistics'), the rationale behind the ROSE instrument and the development of the instrument.

Other ROSE documents include the empty SPSS-file for data entry and the corresponding ROSE codebook with details about data entry. The ROSE documents can also be downloaded from http://folk.uio.no/sveinsj/

 

 

Contents

 

Overview: ROSE in brief

ROSE Guidelines and practicalities

The ROSE questionnaire: underlying ideas

Appendix: ROSE Questionnaire development

 

 


Overview: ROSE in brief

ROSE, The Relevance of Science Education, is an international comparative project meant to shed light on factors of importance to the learning of science and technology (S&T). It is also meant to sustain and develop the respect for and interest in S&T and S&T-related issues.

ROSE involves a wide range of countries from all continents. Key international research institutions and individuals work jointly on the development of theoretical perspectives, research instruments, data collection and analysis. The target population is pupils towards the end of secondary school (age 15+), in many countries the last year of compulsory education, and often the age at which important choices are made.

ROSE is supported by The Research Council of Norway and The University of Oslo. In contrast to the well known large-scale comparative projects on pupils' achievement like TIMSS and PISA, ROSE is low-cost and less rigorous in its logistics etc. Industrialized countries cover their own expenses, while funding will be negotiated for countries with limited resources. Participation in the project may also enhance the possibility of releasing local funding for the participants.

We hope that researchers in individual countries will engage students (in teacher training or at Masters or PhD level) in the project, thereby enabling them to become involved in collaborative research of a critical and comparative nature.

 

Brief Rationale

A broad public understanding of S&T is crucial for national economic development and for the life, independence and autonomy of each individual. Scientific and technological literacy is also of great importance for citizenship and democratic participation in a world dominated by S&T-related issues and challenges. Falling recruitment and interest in S&T studies and careers are observed in many countries, mainly the rich ones.

The lack of relevance of the S&T curriculum is probably one of the greatest barriers to good learning as well as to interest in the subject. The ROSE project has the ambition to provide insight into factors that relate to the relevance of the contents as well as the contexts of S&T curricula. The outcome of the project will be empirical findings and theoretical perspectives that can provide a base for informed discussions on how to improve curricula and enhance the interest in S&T in a way that

 

Ø      respects cultural diversity and gender equity

Ø      promotes personal and social relevance

Ø      empowers the learner for democratic participation and citizenship

 

ROSE is a further development of the project SAS (Science And Scientists)[2]. More detailed descriptions of ROSE as well as of the SAS project can be found at http://folk.uio.no/sveinsj/. These documents are also available in print on request.

 

ROSE Objectives

  1. Develop theoretical perspectives sensitive to the diversity of backgrounds (cultural, social, gender etc.) of pupils for discussion of priorities relating to S&T education.

 

  2. Develop an instrument to collect data on pupils' (age 15) experiences, interests, priorities, images and perceptions that are of relevance for their learning of S&T and their attitudes towards the subjects.

 

  3. Collect, analyse and discuss data from a wide range of countries and cultural contexts, using the instruments referred to above.

 

  4. Develop policy recommendations for the improvement of curricula, textbooks and classroom activities based on the findings above.

 

  5. Raise issues relating to the relevance and importance of science in public debate and in scientific and educational fora.

 

The ROSE advisory group

An international advisory group has been established to serve as main partners in the development of the ROSE instruments etc. The members are:

 

·              Dir. Vivien M. Talisayon, The Philippines, e-mail: director@ismed.upd.edu.ph

·              Dr. Jane Mulemwa, Uganda, e-mail: jane.mulemwa@utl.co.ug

·              Dr. Debbie Corrigan, Australia, e-mail: debbie.corrigan@education.monash.edu.au

·              Dir. Jayshree Mehta, India, e-mail: satwac@wilnetonline.net

·              Professor Edgar Jenkins, England, e-mail: e.w.jenkins@education.leeds.ac.uk

·              Dir. Vasilis Koulaidis, Greece, e-mail: koulaidi@upatras.gr

·              Dr. Ved Goel, the Commonwealth, e-mail: goelv@commonwealth.int

·              Professor Glen Aikenhead, Canada, e-mail: glen.aikenhead@usask.ca

·              Professor Masakata Ogawa, Japan, e-mail: ogawam@kobe-u.ac.jp

·              PISA project leader Marit Kjærnsli, Norway, e-mail: marit.kjarnsli@ils.uio.no

·              Professor Svein Lie, Norway, e-mail: svein.lie@ils.uio.no

·              Dr. Marianne Ødegaard, Norway, e-mail: marianod@bio.uio.no

·              Cand.scient. Astrid T. Sinnes, Norway, e-mail: a.t.sinnes@ils.uio.no

 

Time schedule

ROSE started in September 2001. The project leader is professor Svein Sjøberg. A full-time researcher, Camilla Schreiner, works on the project as part of her doctoral degree at the University of Oslo.

An international working seminar with the ROSE advisory group was held in Oslo in October 2001. The aim of this working seminar was to discuss and develop the research instruments as well as the logistics of data collection. Twelve researchers participated, representing different cultures and all continents. The further development and piloting of research instruments and logistics took place through 2002. Preliminary versions of the instruments have been circulated for comments among researchers in the advisory group and other international partners and have been discussed in seminars and workshops etc. (Details are given in the Appendix.)

The ROSE instrument as well as the handbook (this document) were finalized in November 2002. Researchers from all countries are invited to participate in this joint study. At the time of writing, the following countries have joined the project: Australia, Austria, Brazil, Cameroon, Denmark, England, Estonia, Finland, France, Germany, Ghana, Greece, Iceland, India, Israel, Italy, Latvia, Lesotho, Malaysia, Malta, Norway, Philippines, Poland, Portugal, Russia, South Africa, Spain, Sweden, Switzerland, Trinidad and Tobago (West Indies), Turkey, Uganda and Zimbabwe. Many others have expressed an interest and are likely to join. Most countries will collect data in the early part of 2003.

The research is based on cooperation, with the intention that participants will learn from each other. The data that are produced will in due time be made available to all participating researchers. Details are given later in this document.

Now that the ROSE instrument is finalized, the study is in principle open for participation. Details about procedures etc. are given on the following pages.


ROSE Guidelines and practicalities

This part of the document describes details for ROSE participants on practical matters as well as 'rights' and 'duties' related to the ROSE study. We need to ensure a minimum quality of the data. If data files are to be part of the international analysis, certain requirements need to be met. (Data not meeting these requirements may still be used as 'stand-alone' in your national studies and for (careful) comparisons with the international data.)

Translations of the ROSE questionnaire

The 'original' ROSE questionnaire is in English and is provided to participating researchers as a formatted rtf document and/or as a Word file in A4 paper size.

For data collection, use the questionnaire in the language of instruction. When there is a need for translation, please keep exactly to the given format (layout, page breaks etc.), and just replace the original English text with your own.

If the text in your language needs more space than the English original, you may need to adjust the margins to avoid changes in page shifts.

The font in the questionnaire is Arial, and the squares for pupils' responses are also in Arial. If your word processor does not support this font, you may find that the squares are 'translated' to other symbols. Try to avoid this!

If you need to translate the questionnaire to another language, you may contact the ROSE organizers to avoid duplicate translations to the same language. Please aim at making the meaning in your language identical to the English items.

National items and questions

We have tried to make an instrument that can be used in a large variety of cultures. (In fact, to give an account of this variety is a key idea behind the project as such!) It may nevertheless be desirable for some nations to include additional national items or even new questions. The need for local adaptation has been balanced against the need to keep data collection and coding as simple as possible. On the first page of the ROSE questionnaire, you may add some questions (max 4) to allow you to ask for background data like region, school district, school type etc.

At the end of the instrument you may also add your own national items, for instance to give a more local flavour to some of the questions in the questionnaire (like experiences, interests, future plans etc.). You may also add further questions at the end to serve as background variables (about home and family background, urban vs. rural etc.).

Target population(s)

In principle, the ROSE target population is the cohort of all 15-year-old pupils in the nation, or more precisely: the grade level that most 15-year-old pupils are likely to attend. This is, in many countries, the last year pupils attend lower secondary school, and it often coincides with the end of compulsory schooling. In many countries, this is also the last year before streaming according to educational choices or other forms of selection take place. (These considerations are not equally valid for all countries and educational systems.)

ROSE tries to shed light on the range and the variety of pupils' experiences, interests, perceptions etc. in issues related to S&T. The vast variation in types of countries and cultures has implications for the definition of the target population:

Some countries are rather homogeneous and 'mono-cultural'. Here it makes sense to talk about national averages etc. Other participating countries have large variations due to geography, differences in culture or ethnicity, level of economic development etc. In such cases it may not make sense to calculate national averages. (In fact, one may lose sight of the educationally interesting variety by calculating national means!) In such countries, one may consider defining the target population as a more homogeneous subgroup, for instance a 'state' or a particular administrative or otherwise clearly identifiable unit. As a consequence, in such countries one may prefer to define more than one target population, or one may define identifiable strata in the national population.

Furthermore, the national researchers' economic and human resources differ between the participating countries. Based on the local national circumstances, one may define an accessible population that is smaller than the whole national pupil cohort, for example a culturally or geographically defined group as indicated above.

Whatever choice one makes, care should be taken to be explicit in the definition of the target population. This is important in order to avoid later confusion and unwarranted conclusions. If there are questions about how to define a suitable population, please discuss them with the organizers.

Sampling

The sample should be drawn so that it represents the target population as defined above. For practical reasons the sampling unit is likely to be the school class (and not the single individual). This implies that whole classes are expected to take part in the study. Using whole classes does, however, reduce the variability, and hence the 'effective sample size'. One should therefore as a rule use only one school class from each school to avoid further reduction of the effective sample size.
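To make the point about 'effective sample size' concrete, here is a minimal sketch using the standard design-effect approximation from survey sampling; the formula and the intraclass correlation value are illustrative assumptions, not figures prescribed by ROSE.

    # Illustrative only: approximate loss of 'effective sample size' when whole
    # classes (clusters) are sampled instead of individual pupils.
    # The intraclass correlation (rho) below is an assumed value for the example.

    def effective_sample_size(n_pupils, class_size, rho):
        """Effective sample size under the usual design-effect approximation."""
        design_effect = 1 + (class_size - 1) * rho
        return n_pupils / design_effect

    # Example: 25 classes of about 25 pupils each, with an assumed rho of 0.1
    print(round(effective_sample_size(n_pupils=625, class_size=25, rho=0.1)))  # about 184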

The sample should be drawn from the class level with the highest proportion of 15-year-old pupils. Within the defined target population, one should identify the existing schools, preferably from available statistical school administration data. In some countries educational or statistical authorities may assist in providing such lists as well as providing a representative sample. From the list of schools, one should draw at random a specified number of schools for participation. If school sizes vary considerably, one may use proportional sampling in order to get a representative sample. This means that before drawing, each school should be given a weight that is proportional to its number of pupils, as sketched below.
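A minimal sketch of how such weighted drawing of schools (probability proportional to size) might be scripted; the school list and its layout are invented for the example, and any statistics package or assistance from the national authorities can of course be used instead.

    import random

    # Hypothetical list of schools: (school name, number of pupils at the target grade)
    schools = [("School A", 120), ("School B", 45), ("School C", 300), ("School D", 80)]

    def draw_schools_pps(school_list, n_schools, seed=1):
        """Draw schools with probability proportional to size.

        random.choices samples with replacement; if the same school is drawn
        twice, the duplicate should be replaced by a fresh draw in practice."""
        rng = random.Random(seed)
        names = [name for name, size in school_list]
        weights = [size for name, size in school_list]
        return rng.choices(names, weights=weights, k=n_schools)

    print(draw_schools_pps(schools, n_schools=2))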

At each school, only one class should take part. Take care to make a representative selection of school types, where different types exist (girls' and boys' schools, boarding schools, etc.). The type of school may be one of your nationally defined background variables as indicated above.

One should aim at a minimum of 25 participating schools – preferably more. With 'normal' class sizes of about 25, 25 schools should give a minimum of 625 respondents. (If you plan for 25 schools, be sure to sample a considerably higher number, since you are not likely to get a 100 % response rate!)

If you want to compare sub-groups within your national population, you should go for larger samples than indicated above to ensure that the groups you contrast are sufficiently large.

Preparations take time!

Please be aware that the preparations for the actual data collection may be time-consuming! Data collection should take place at the earliest convenience. The international data analysis will start in the beginning of 2003. We have, however, not yet decided on any definite time limit for data collection.

In most countries you may need official permission to gain access to the schools and pupils to collect data. In some places you may even need such permissions on a regional level. And you certainly need to get permission at each school, possibly at the 'top' level, but certainly at the classroom teacher level. Some countries even require permission from the pupils' parents.

These practical and legal constraints vary from country to country, and the best way forward must be determined by each researcher (or group). Do not underestimate the time that this may require. In this planning process, many 'local' decisions are likely to be taken. Please take care to describe these as clearly as possible when data are submitted.

If a letter of recommendation from the ROSE organizers will help you in getting the necessary permission, we will provide this. It is a good idea to start preparing for data collection at the earliest opportunity.

Administration of questionnaire

The ROSE study is not a test, and there are no correct answers that can be used for ranking according to some pre-determined measure of quality. Hence, there is no need to be strict in the guidelines for administration and data collection. The important thing is that we get reliable and honest data, and that the pupils understand the questions. They should also be given enough time to complete the questionnaire. Pilot testing has indicated that one normal lesson (about 40 min) is sufficient time, but this may not be enough when there are problems with the language etc. Please ensure that the pupils get time to answer all questions. The administrator may even explain questions where they are not fully understood. One may even consider the possibility of completing the questionnaire as homework.

The questionnaire should be presented by the normal class teacher, but the researcher may assist and supervise. After the completion and collection of the questionnaire, the researcher or teacher may fill in the necessary school code or other information on the front page for later identification. At a later stage (during data entry), all questionnaires from each country should be given a unique identification number for easy retrieval in case of corrections etc. The open-ended question may be coded later, hence the identification number is essential.

Coding of data

Each participating researcher (or group) must follow precisely the common guidelines for data entry. We will use SPSS (Statistical Package for the Social Sciences) as the instrument for analysis, but Excel may be used for data entry if SPSS is not available. Empty data files in SPSS and Excel format will be provided. The corresponding code book with the necessary information for data entry will also be made available.

The first page in the questionnaire contains a few background data about the respondent. Additional information might be added by the researcher (or the teacher administering the questionnaire). Each national researcher has to decide what background information is needed. The ROSE instrument and data file have, as mentioned, set aside 4 extra variables for this purpose, to be included on the first page. These may be the name of the school, type of school, region etc.

The coding will be made as easy as possible. Details will be apparent from the code book and will also appear as 'legal' values in the empty data file that is provided. As a general rule, the actual position of the respondent's tick will be the value to be entered (a tick in the first box will be entered as '1', a tick in the second box will be coded as '2' etc., and no response will be coded as '9'). Each page shift in the questionnaire will be coded with the letter 'x', since this ensures that a possible mistake (e.g. a shift in position) can be easily detected. Details will be given in the code book.
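The coding rule can be summarised in a few lines; the sketch below is purely illustrative (actual entry is done directly in the SPSS or Excel file), and the list of tick positions for one respondent is invented.

    # Illustration of the coding rule: a tick in box 1 is entered as 1, box 2 as 2,
    # etc.; no response is entered as 9. The example data are invented.

    def code_response(tick_position):
        """Return the value to enter for one item; None means no box was ticked."""
        return 9 if tick_position is None else tick_position

    # One hypothetical respondent: ticks for four items, the third left blank
    ticks = [1, 4, None, 2]
    print([code_response(t) for t in ticks])   # -> [1, 4, 9, 2]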

The open question ("Myself as a scientist") at the end needs interpretation before coding, and details will be provided. These data will for practical reasons be coded separately. It is therefore important that each questionnaire is identified by the running number as indicated above.

Cleaning of data

Since only the coded files (and not the questionnaires) are returned to the ROSE organizers, it is essential that the data are properly cleaned to avoid mistakes, since these cannot be traced and corrected by the organizers! In any case, we ask you to keep the original questionnaires to be able to trace possible mistakes at a later stage.

There are many ways of cleaning data to ensure quality. If you use SPSS for data entry, you may for instance run frequency tables for all variables to search for values outside the 'legal' range. Some details and suggestions for data cleaning and proof-reading will be provided in the code book.
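If the data are exported to a plain text file, an equivalent 'frequency check' can easily be scripted; the sketch below is illustrative only, with an invented file name, and it assumes that all checked columns use the legal values 1-4 plus 9 for missing (background and page-shift variables would need their own legal ranges).

    import csv
    from collections import Counter

    LEGAL_VALUES = {"1", "2", "3", "4", "9"}   # response codes plus 'missing'

    # Hypothetical CSV export of the national data file
    with open("rose_data.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    for column in rows[0]:
        counts = Counter(row[column] for row in rows)
        illegal = {value: n for value, n in counts.items() if value not in LEGAL_VALUES}
        if illegal:
            print("Check variable", column, "- unexpected values:", illegal)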

Return of data files

When you return data, please provide as detailed information as possible about the definition of the population and the selection of the sample. Describe the underlying considerations, whether these are of a practical nature or based on educational or other concerns. You may send us the data file as an e-mail attachment or on a diskette. The format may be either SPSS (preferably) or Excel.

"Rights and duties"

ROSE is intended to be a collaborative work, where all researchers contribute and benefit. Participating researchers may conduct their own research on their national material, given the following guidelines:

 

  • All national reporting should pay proper credit to the project with a suitable reference to the ROSE project and its organizers.
  • International reporting by the organizers should also pay credit to the ROSE project and the participating researchers who have contributed to the international data file.
  • National reporting should take place only when the whole international data collection is finalized. (Exceptions may be given to this, for instance when students collect data as part of their teacher training or for essays or degree work.) Please contact the organizers if you are in doubt.
  • When the international initial report is published, all participants will have access to the whole ROSE data file, and may use this for their own research (any reporting must of course give credit to the ROSE project and explain the background)
  • Copies of all papers based on the ROSE data should be sent to the organizers when published.
  • The organizers will send all international ROSE reports and papers to all participants when available.
  • A ROSE web site will be established. This will contain all relevant information about the project, and will at a later stage contain all publications. For the moment, ROSE documents will be placed at http://folk.uio.no/sveinsj/

Additional qualitative data?

With a standardized questionnaire one may compare responses from large groups and from widely different cultures. But data collected with questionnaires have obvious limitations. It is not always easy to interpret what pupils have had in mind when they simply tick boxes in a pre-determined questionnaire. This is the limitation of this type of research. We have left only one question open for free response through writing (Question I: "Myself as a scientist"), and details of coding will be described in later communications.

In order to give more nuance to the 'hard data' from the questionnaire, we suggest that the ROSE data collection is accompanied by interviews with some of the pupils. This may shed light on how they think when they answer the questionnaire. This sort of information may be of value when drawing conclusions and interpreting results.

Funding

The basic funding for ROSE is from the Research Council of Norway and The University of Oslo. This, however, only covers a full-time researcher, related costs in Norway and very limited international support.

Participating researchers are requested to seek local funding to support their own participation. Researchers in OECD countries are expected to finance their own participation, while developing countries and not so wealthy countries may need assistance. Some countries may 'sponsor' the participation of other countries.

The ROSE team will assist in this process of seeking funds, for instance by writing letters of support. One should also mention that when the Research Council of Norway decided to fund the ROSE project, this decision was based on judgements given by international referees. Participation in an international research project may enhance the possibility of securing national funding.

By the end of 2002 we also received additional funding from the Norwegian Ministry of Education to support other countries. Priority will be given to support participation of countries with weak economies. If more resources become available, the intention is also to arrange workshops and/or training seminars. We are rather optimistic about such possibilities.

Involving students?

Many of the researchers involved in ROSE are involved in teacher training and/or degree work in science education at Master or PhD level. It may be a good idea to use participation in ROSE in connection with such work. Many countries (for instance all the Nordic countries) have already indicated that they will do so, at the PhD as well as the Master level. PhD students from 5 different countries have already decided to base their theses on ROSE. Students may of course be involved in different aspects of the study, in data collection, or through writing essays or thesis work based on the results.

The ROSE questionnaire: underlying ideas

All comments below refer to the ROSE questionnaire. Although many questions are more or less self-explanatory, some clarification may help you understand why we have asked these questions, and what we may get out of the responses.

Background questions

The first page contains a short introduction as well as a few background questions (sex, age and nationality). These are the only variables we will use for international comparisons. In addition, national teams may add further specifications that they find useful in order to contrast groups of particular interest (e.g. region, type of school etc.). We leave 4 empty variables in the codebook for such variables. We suggest that you do not add such variables unless you want to use them in your analysis.

Educational research shows the importance of 'out-of-school' variables like family, neighbourhood and the wider community in which the pupils live. We have received many comments to our question of whether or not to include more background variables. Many have suggested that we should include more questions about the social background of the pupils. The large scale OECD/PISA[3] and IEA/TIMSS[4] studies have numerous questions regarding the socio-economic background of the pupils -- like how many members there are in the family, living conditions, parents' education and occupation, parents' income, whether the pupils have their own PC, the number of rooms in their home etc.

We are hesitant to ask for more personal background than we do. One reason is that the ROSE study covers such a wide variety of cultures that it may be impossible to ask questions about family background that can provide meaningful comparisons. Besides, we are in general not trying to 'explain' our findings by reference to such variables.

Such background variables may, however, be important in national analyses in some of the participating countries. As indicated, we therefore leave four open places for such variables at the first page of the questionnaire. Each country is also free to add more national questions at the end of the questionnaire, and you may add the necessary additional variables to the code book in order to meet your requirements. Some of the ROSE researchers are also active in the PISA study, and they may use some of the PISA background questions to open the possibility of linking their data with the PISA material.

We have included the question "How many books are there in your home?" in the ROSE instrument. This item is a true copy of a question used in the PISA study, since this question has turned out to be a very good proxy for the socio-economic status of the home of the pupils. Although ROSE does not have the same focus as PISA and TIMSS on scholastic achievement and pupils' performance, the socio-economic background of the pupils may be interesting and relevant for ROSE in a science education perspective aiming at science for all.

In the questionnaire translation, countries using non-metric measures may need to convert the term "40 books per metre" to its non-metric equivalent.

"What I want to learn about" (Questions A, C and E)

This is a rather lengthy question with many items. To avoid fatigue among the pupils, the question is divided into three parts: questions A, C and E. The idea behind this question is to get empirical evidence on what sort of issues pupils are interested in learning about, to explore how these vary between groups and to search for patterns in the answers. This question may provide insight into how different topics may or may not appeal to different groups of learners. This information can give us insight into how science curricula may be constructed to meet the perceived needs or interests of different groups of pupils – and how this may vary between different countries and cultures.

 

Contents \ Contexts   | Natural phenomena | Technological applications | Care, health | Spectacular aspects | Philosophical aspects
Astrophysics          | Item a            |                            |              | Item g              | Item i
Earth science         |                   | Item c                     |              |                     |
Genetics              |                   |                            |              | Item h              | Item j
Zoology               |                   | Item d                     | Item e       |                     |
Optics                | Item b            |                            |              |                     | Item k
                      |                   |                            | Item f       |                     |
                      |                   |                            |              |                     | Item l

(Only some content areas, contexts and items are shown; the table serves as an illustration of the classification, not as a complete mapping of all items.)

The table above indicates the logic behind the question: each item may be classified along two 'dimensions'. The first (vertical) is the contents or subject-matter area; the second (horizontal) is the context in which the subject matter is placed.

 

Subject-matter content is often classified by the 'usual' scientific concepts, often the key words in the different sciences. These are the topics as they often occur in curricula and textbooks. Some examples are indicated in the table above.

Our list of contents is by no means a comprehensive list of all the possible ones from the sciences! We have chosen to concentrate on the following subject-matter areas:

 

-        Astrophysics    

-        Earth science   

-        Human biology with sex and reproduction

-        Genetics

-        Zoology

-        Botany

-        Chemistry

-        Light and optics

-        Sounds and acoustics

-        Electricity and energy

-        Technology

-        STS (Science, Technology and Society) and NOS (Nature Of Science)

 

The horizontal classification by contexts can also be done in many ways, and a few examples are indicated as headings in the table above. The classification scheme suggested here cannot be applied strictly, but rather as a kind of guiding principle. Our considerations can therefore only be seen as a device to assist our thinking, and to ensure that we cover a wide span of contents and contexts. We have tried to put the subject matter listed above in the following types of contexts (not mutually exclusive):

 

-        Natural phenomena and nature study

-        Spectacular phenomena, horror, frightening examples

-        Humankind and human life and body

-        Technological ideas and inventions

-        Philosophical aspects; science and non-science or quasi-science, religious implications or belief-oriented

-        Aesthetical aspects, beauty

-        Care, health, protection and improvement of living conditions

-        Personal use and everyday relevance

 

Some of these contexts have been derived from our reading of sociological and psychological theories about identity, youth culture, modernity etc. Some of the items may seem controversial and unusual in a S&T educational context, e.g. items regarding ghosts, horoscopes, mind-reading, clashes between science and religion, etc. The inclusion of these items does not mean that we think these issues are legitimate parts of an S&T curriculum! They are included because we want to explore the variety of pupils' interests -- also in some unusual contexts. The analysis of the data from these items may illuminate questions like:

 

·        How does the context versus the subject matter area influence young people's interests?

·        Do girls' interest scores in contexts or subject matter areas differ markedly from boys'?

·        What kind of interests among young people seem to be shared or universal, and what seems to be influenced by their cultural background?

 

We also want to see the results from this question in relation to results from other questions.

"My future job"  (Question B)

This question provides information about the future priorities and motivations of the pupils. This is in itself interesting information, and allows for comparisons across cultures and between boys and girls. Results from the SAS study indicate that interesting patterns emerge through factor analysis. This question is of particular interest when the results are connected with the responses to other ROSE questions, for example to the question above, What I want to learn about, as well as to other questions, like Me and the environment, My science classes and Myself as a scientist. Some of the items are not directly connected to S&T, but the question may provide a framework for understanding responses to other questions.

"Me and the environment"  (Question D)

This question seeks to explore how pupils feel about one group of important challenges, those related to the environment. We want to explore whether they think these issues are serious or not, whether they feel personally involved and to what extent they feel empowered to influence the solutions of important social and political issues. We have chosen the challenges of the environment, since this is a global challenge. We are well aware that other issues may be more urgent in some parts of the world, like e.g. AIDS/HIV. But since the importance of this issue (and most others) varies so strongly from one country to another, we decided to concentrate on problems of the environment. The issue of environmental protection is closely connected to the realm of science and technology, consequently the data analysis and the discussion of the results can be related to implications for science teaching.

The question is designed to tap different aspects of pupils' relationship to the environmental challenges, like the following:

 

·        To what extent do they feel that environmental challenges are important?

·        Do they have a sense of control and feel that they can make a difference?

·        Do they hold it to be the responsibility of every individual to confront the problems, or do they think that somebody else should solve the problems? (e.g. the experts, the rich countries, technology)

·        To what extent are they optimistic about the future?

 

This question also has some items that will provide information about the pupils' perceptions of and attitudes to matters like science and technology, animal 'rights', the possible 'sacredness' of the natural world, etc. The responses will give us some information about the cultural background of the pupils, and may be valuable for understanding responses given to other questions in the questionnaire.

"My science classes" (Question F)

This question provides information about different aspects of the pupils' perception of their science classes, like their motivation for science at school, their self-confidence in their own abilities in science at school, what they get out of science at school, their perceptions of the necessity of science education etc. We know that aspects like self-confidence, attitudes, interest and motivation are key factors associated with learning. The responses will make it possible to describe what pupils in different countries actually think they have learned from their science classes.

The terms 'school science' and 'science at school' in this question refer to the education in science (biology, physics, chemistry, geology, geophysics, astronomy, etc.) and technology that the pupils get at school. Each nation is requested to substitute these terms with the proper name of the corresponding school subject in their country. In the Norwegian lower secondary school, for example, the school subject corresponding to science is entitled 'Natur og miljøfag' ('Nature and environment'). In England and Wales the subject at school is simply called 'Science'. In any case, it is important to use an expression that refers directly to the school subject science (and not to science in a more general sense).

"My opinions about science and technology"  (Question G)

This question probes different aspects of how the pupils perceive the role and function of science and technology in society. We explore their possible trust or distrust, their interests and support etc. Many of the questions are copies of questions used in large scale public surveys like the Eurobarometer[5] and similar surveys in other parts of the world. The responses are interesting in themselves, and may also be compared with the corresponding replies by the adult population in many countries. We may also explore how the responses on these items are related to responses on many other questions in the ROSE questionnaire.

"My out-of-school experiences"  (Question H)

This question provides information about pupils' out-of-school experiences: activities that may have a bearing on their interest in S&T and that may provide important experiences for the learning of science at school. Responses to this question will give teachers, curriculum makers and textbook writers a description of what kind of S&T-related experiences children bring to school, and how these vary between girls and boys and between diverse cultures.

Meaningful teaching should build on the learners' experiences, and this question can provide such insights. It may for instance shed light on whether the dominating form of science teaching (or textbooks) favours certain groups of pupils at the expense of others. Responses to this question can also be analysed and seen in relationship to answers to other questions, for instance What I want to learn about.

"Myself as a scientist"  (Question I)

This is the only open-ended question, where the pupils are invited to express opinions in their own words. The question has two parts. The first asks what they would like to work on, the second asks for the reasons for this particular choice. The first part may be analysed in terms of a classification by problem area or subject matter (e.g. medicine, space exploration, computer technology etc.), while the second part may be analysed in terms of personal motivation and values (e.g. helping others, personal interest or curiosity, seeking money and success etc.).

Details of the coding scheme will be developed later.

The responses to this question may be discussed in terms of differences between girls and boys, between different cultures etc. The results may also be explored for possible relationships with other questions, e.g. "My future job", "My opinions about science and technology", "What I want to learn about" etc.

General comments

Response categories

Some of the ROSE questions present a set of statements. For each statement the pupils are asked to indicate their response by ticking the appropriate box. We use Likert scales with categories for Disagree-Agree, Not interested – Very interested, Not important – Very important, etc.

A Likert scale with five response categories can have a neutral middle point. We have chosen four response categories, and left out the neutral middle point. We have also omitted the 'Do not know' category. However, in the introduction to each question we state that they may refrain from ticking any boxes if they do not know what to answer.

The rationale behind this is that previous experience has taught us that it seems too convenient for people not to take any stance, ticking the neutral middle point or the 'Do not know' category. Consequently we get 'too many' scores on the neutral boxes, and data lacking the diversity we are searching for. Without a neutral middle point and a 'Do not know' category, the respondents are in a way 'forced' to take a stance. This can of course be debated: On the one hand, people should have the right to remain neutral on an issue. On the other hand, the neutral middle point often differs markedly from the regression line in correlation analysis, causing complexities in the data analysis.

Likert scales in the ROSE questionnaire have headings only for the extreme categories, while the two categories in the middle of the scale are untitled. This is done because, firstly, we thereby avoid the task of making good and balanced titles, and secondly, the translation of the titles to other languages will be more straightforward. In interviews, respondents did not point to the lack of titles as problematic. We are inclined to believe that respondents imagine the space between the extreme boxes as a continuous scale, with the untitled boxes dividing the scale into four equal-sized intervals.

 

Reliability, validity, etc.

The ROSE instrument is designed for exploring the variations in interests, experiences, empowerment, priorities, perceptions, attitudes, etc. However, there are no direct means for measuring affective dimensions such as those that exist in the physical sciences for the measurement of length, weight, etc. The ROSE instrument instead relies on indirect means: a number of items selected to serve as indicators of some more complex constructs.

In considering validity one asks how useful or meaningful the instrument is for measuring the intended dimensions. The ROSE questionnaire has been developed through a rather thorough process involving many different actors (details are given in the Appendix). This process has, in addition to being crucial for the item development, also served as a kind of triangulation or validation of the instrument:

 

-        pupils, teachers and researchers from many countries have given their ideas for items in the questionnaire

-        pupils and researchers from many countries have promoted the gradual improvement of the items by review of face validity

-        pupils have given their responses in interviews, discussion groups and in pilot studies

-        inter-item consistency has been considered in the Norwegian pilot data analysis

 

Sufficient inter-item consistency in the analysis of data collected from Norwegian respondents does not guarantee the same reliability or validity in the international material. The Norwegian pilot study may not be sufficiently broad to accommodate what is found in other cultures and other education systems. Rather, cultural diversity may imply that an item that measures a particular construct in one country will measure something else in other parts of the world, and that questions that are meaningful in one culture may seem strange or even incomprehensible in other cultures. This implies that a questionnaire that is valid in one culture may, under other circumstances and in other cultures, be valid in different ways. The ROSE project is a low-budget study, without finance for an international pilot test and subsequent international data analysis. Consequently, the data analysis after the international data collection must be accompanied by some cross-cultural validation of the questionnaire.
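One possible (not prescribed) way of carrying out such a consistency check country by country is to compute Cronbach's alpha for a proposed group of items in each national data set; the small response matrix below is invented for illustration.

    # Illustrative inter-item consistency check (Cronbach's alpha) for one group
    # of items within one country's data. Rows are pupils, columns are items,
    # values are the response codes 1-4; the numbers are invented for the example.

    def cronbach_alpha(scores):
        n_items = len(scores[0])

        def variance(values):
            mean = sum(values) / len(values)
            return sum((v - mean) ** 2 for v in values) / (len(values) - 1)

        item_variances = [variance([row[i] for row in scores]) for i in range(n_items)]
        total_variance = variance([sum(row) for row in scores])
        return n_items / (n_items - 1) * (1 - sum(item_variances) / total_variance)

    pupils = [[1, 2, 2, 1], [3, 3, 4, 3], [2, 2, 3, 2], [4, 4, 4, 3], [1, 1, 2, 2]]
    print(round(cronbach_alpha(pupils), 2))   # about 0.96 for this invented example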

The ROSE study has an explorative character; it does not test pupils against a set of predetermined 'constructs' or standards. ROSE aims at stimulating discussions about contents and contexts in science curricula in diverse cultures. The project is more likely to generate hypotheses and theories than to confirm or reject given hypotheses. In many cases, reporting on single items may be more fruitful and telling than using theoretical constructs. (This is often also done in large surveys like the Eurobarometer.)

As mentioned, psychometricians often ask for clearly defined 'constructs', often to be measured by means of a large number of items that explore the same underlying idea. In our pilot phase we investigated this issue in some detail, and we also interviewed respondents. They were often frustrated when they discovered that we asked 'the same question' in different disguises. We have tried to balance 'ideal' statistical requirements against common-sense reasoning based on interviews etc. We have therefore deliberately avoided asking 'the same question' in many different forms.

We have also tried to avoid 'negative statements': It turns out to be difficult to understand what it means to disagree with a negative statement (even for adult respondents!). In the ROSE project, many respondents will get the questionnaire in a language that is not their mother tongue (for instance in many African countries). This is also an argument for keeping the language as simple as possible and for avoiding (double) negative statements.

As one may understand, we have made several compromises during the construction of the questionnaire. Test-theoretical and psychometric requirements have been carefully balanced against practical requirements based on the nature of the study (exploring, not ranking), limited material resources, a wide spread of cultures etc. When we were in doubt during test development, we relied more on the face validity as judged by our group of international experts than on correlation coefficients. Details of the process behind the questionnaire development are described in the Appendix.


 

Appendix: ROSE Questionnaire development

The ROSE project is in many ways a continuation of the SAS-project, and the ROSE questionnaire is a further development of the instrument developed in SAS. ROSE aims at exploring the richness and variation in pupils' experiences, priorities, interests, perceptions, attitudes, etc. that can have a bearing on discussions about how to make science curricula more relevant and meaningful. To ensure that the ROSE instrument is broad enough and has sufficient span to capture the diversity of interests etc. in a variety of cultures, the instrument has been developed through a rather lengthy process starting September 2001 and ending November 2002. Here are some of the details.

 

Workshop with ROSE Advisory Group

Questions that are meaningful in one culture may seem strange or even incomprehensible in other cultures; consequently, international contributions have been crucial for the questionnaire development to ensure a cultural balance. In October 2001, a number of researchers from all continents got together in a workshop near Oslo in Norway. The purpose of the workshop was to discuss and collect theoretical perspectives for the research, input to questionnaire items and advice on the guidelines for sampling and logistics. The ROSE Advisory Group was established. (The names are given in the Overview in this Handbook.)

 

Preliminary studies in Norway

We found it important to hear the voice of the people who spend their everyday life in school, and during January-February 2002 we conducted some preliminary studies with Norwegian pupils and teachers. The purpose of these studies was to generate new ideas and different points of view on science education. The preliminary studies involved:

 

-        Two discussion groups with five pupils in each group. The pupils discussed scientific and technological issues that they were enthusiastic about, and what use they had made of knowledge from science at school in their life out of school.

-        Written lists from about 80 pupils on scientific and technological topics and contexts that they found interesting, and what use they had made of knowledge from science at school in their life out of school.

-        Conversations with six science teachers to identify what kind of topics, in their experience, the pupils are interested in, what the teachers perceive as the most important intention behind science at school, and how they wish their teaching of science to prepare and equip young people for their adult life.

 

First draft ROSE questionnaire  

The outcome of the workshop was revised, taking the responses and ideas from the preliminary studies into account. In April 2002 a draft version of the questionnaire was distributed to the ROSE Advisory Group and to all researchers who had expressed an interest in participating in the ROSE project.

The most important purpose of this international draft consideration was to ensure that the questionnaire was broad enough to capture cultural diversity. We asked for item suggestions that would reflect the particular circumstances of each country or culture. The researchers were also invited, if time and resources allowed, to conduct some kind of preliminary studies with pupils and/or teachers in their own country.

We received a considerable number of responses from a wide range of countries, and some countries had even piloted the questionnaire. The diversity of the recommendations and suggestions we got showed us some of the challenges we are facing when developing a cross-cultural questionnaire like ROSE. Some of the comments drew us in very different directions, and some even seemed to contradict each other.

 

Piloting

While the questionnaire was circulated for these first international considerations, it was also translated into Norwegian and piloted in five Norwegian school classes. Some 130 pupils answered the questionnaire. The purpose of the pilot test was to collect experiences from teachers and pupils on how the survey was carried out, their spontaneous and unrestrained reactions to the questions, the organisation and duration of the survey etc. This information was obtained by:

 

-        requesting the pupils to indicate the questions that they did not understand, whether a word or the meaning

-        subsequent conversations with the teachers

-        discussion with a group of eight pupils about their attitudes to and understanding of the questions

-        data entry and quantitative analysis in SPSS

 

GRASSMATE meeting

In June 2002 we also had the opportunity to meet the GRASSMATE (Graduate Programme in Science, Mathematics and Technology education) group in Bergen in Norway, a Sub-Saharan African group of supervisors for African doctoral students. The supervisors were key science educators from several African countries and from Norway.

We jointly elaborated on the questionnaire, and received some useful comments. Particularly the input on cultural balance and the discussions about different values and worldviews in different cultures were enlightening and fruitful for the questionnaire development.

 

Second draft ROSE questionnaire

Taking into account all the input from the first draft distribution, the Norwegian piloting and the GRASSMATE meeting, we revised the instrument.

In August 2002 the new version of the questionnaire was distributed to the ROSE Advisory Group and to the researchers on the list of possible partners in the ROSE project for second considerations. We asked for

 

-        feedback on cultural and gender balance

-        advice on where the questionnaire could be shortened

-        suggestions for simpler and clearer alternatives for the English language and wording

 

We received numerous responses from a wide range of countries. In fact, we got input from Russia, Finland, Uganda, Denmark, Sweden, Norway, The Philippines, Korea, USA, United Kingdom, Canada, Greece, South Africa and Ghana.

It was not easy to incorporate all these comments. Nearly everybody indicated that the questionnaire was too long, and nearly everybody came up with new ideas that they suggested should be incorporated in the revised version!

It is also interesting to note that some researchers commented on what they considered to be a western and middle-class orientation in the instrument, while others indicated that we had a bias in favour of developing countries as well as an ideological orientation towards 'alternatives' at the expense of 'proper science'. This shows the difficulties of trying to make an instrument that can be used across cultures! It may also be seen as indirect support for our ideological underpinning, i.e. that one should be careful in making global 'blueprints' for what constitutes school science contents! In any case, we have tried to balance the different inputs by making several compromises.

 

Interviews

While the second draft of the questionnaire was being circulated, we conducted individual interviews with two Norwegian pupils in connection with their questionnaire responses. The purpose of the interviews was to consider their spontaneous and unrestrained reactions to the items and how consistent their perceptions of the items were with our intentions behind them.

 

Third draft ROSE questionnaire

We tried to form a satisfying whole by taking the input from all our contributors and collaborators into consideration and by balancing these comments with the aims and objectives of the project. It has been a challenge to ensure that nothing is unjustifiably emphasized at the expense of something else.

In September 2002, the last draft was distributed to the ROSE Advisory Group and to all international ROSE partners. This time we asked for only smaller suggestions for item adjustments and for comments on the two open questions at the end of the questionnaire.

Again we got valuable input from many international scholars. A last discussion took place in mid-October, when we had Professor Edgar Jenkins as a visitor. He is also a member of the advisory group, and has provided considerable input to the development of the ROSE project since we started. The current version of the ROSE questionnaire was finalised at the beginning of November 2002.

 

 

 

Oslo December 27th  2002

 

 

 

 

Svein Sjøberg               Camilla Schreiner



 

[1] © Dept. of Teacher Education and School Development, University of Oslo,
Box 1099 Blindern, 0317 Oslo, Norway and Svein Sjøberg (svein.sjoberg@ils.uio.no)

 

[2]  Sjøberg, S. (2002). Science for the children? Report from the Science and Scientists project. Acta Didactica 1/2002 (ISBN 82-90904-65-7). Dept. of Teacher Education and School Development, University of Oslo.

 

[3]  PISA reports etc. are available at http://www.pisa.oecd.org/

[4] TIMSS reports etc are available at http://timss.bc.edu/

[5] See e.g. http://europa.eu.int/comm/public_opinion/