It may be cold outside, but there’s a warm welcome in Hamburg!

By Nelladee McLeod Palane, University of Pretoria, South Africa

Arriving in Hamburg in January, directly from Pretoria, I found that nothing could quite have prepared me for the cold in Germany, but even the icy weather contributed to the refreshing and stimulating nature of the intellectual experience!

In January, I was privileged, as part of my doctoral studies, to participate in the Academic Visitor Program hosted by the Research and Analysis (RandA) Unit of the International Association for the Evaluation of Educational Achievement (IEA). This program is designed to provide opportunities for collaboration between visiting researchers and RandA staff members, to promote networking within a worldwide research community, and to foster the international exchange of ideas in educational evaluation. In my case, I was there to learn, directly from senior IEA researchers, about the analysis techniques that could support my own research project.

I am thankful that this visit was made possible by my doctoral supervisor, Professor Sarah Howie, and the University of Pretoria’s Postgraduate Study Abroad Program. I wanted to learn more about multilevel modeling for the analysis of data from PIRLS (Progress in International Reading Literacy Study) and its companion assessment, prePIRLS, and how to understand and apply the latest research techniques.

At the IEA Hamburg, I gained a wealth of insight into how to analyze PIRLS contextual data comprehensively using Mplus software. My PhD study aims to examine the effects of home and school contextual factors on performance in higher-level reading comprehension processes, within the context of the differing language-of-instruction models found in South African primary schools. I was given the desk space and facilities to work independently at the research center, while drawing regularly on the kind guidance of IEA colleagues Dr Agnes Stancel-Piątak, Falk Brese and Nadine Radermacher. Working at the IEA provided me with much-appreciated exposure to structured expert meetings concerning the PIRLS 2011 database, sampling design and scaling methodology. This information was extremely helpful for my PhD studies, and continues to inform my participation in the PIRLS 2016 research activities at the Centre for Evaluation and Assessment in South Africa.
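
For readers curious about what such an analysis involves, the sketch below shows a minimal two-level model in Python using statsmodels. It is only an illustration: the variable and file names are hypothetical, my actual analyses were run in Mplus, and a full PIRLS analysis would also need to account for the sampling weights and plausible values in the data.

  # Minimal two-level (students within schools) sketch; all names are hypothetical.
  import pandas as pd
  import statsmodels.formula.api as smf

  # Hypothetical student-level extract with a reading score, home and school
  # context variables, and a school identifier defining the grouping structure.
  data = pd.read_csv("prepirls_students.csv")

  # A random intercept for schools captures between-school variation in reading
  # achievement; home and school context variables enter as fixed effects.
  model = smf.mixedlm(
      "reading_score ~ home_resources + parent_education + school_ses",
      data=data,
      groups=data["school_id"],
  )
  result = model.fit()
  print(result.summary())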

I am deeply grateful to Dr Sabine Meinck, Head of both the Research and Analysis Unit and the Sampling Unit, for the opportunity to be equipped in this way for my further research into the PIRLS data.

 

For more information about Nelladee’s research, please contact her at nelladee.palane@up.ac.za

Researchers and students interested in the IEA’s academic visitor program, which enables visiting researchers and RandA staff members to come together to exchange ideas and develop joint projects, can learn more at http://www.iea.nl/academic-visitor-program.

Using TIMSS and PIRLS data for secondary analyses: A workshop in Morocco

By Yasin Afana

From 5 to 8 September 2016, the International Association for the Evaluation of Educational Achievement (IEA) provided a workshop on ‘Using TIMSS and PIRLS Data for Secondary Analyses’ for the National Authority of Evaluation at the Higher Council of Education, Training and Scientific Research in Rabat, Morocco. The training course was developed following consultation between the IEA and the Moroccan National Authority of Evaluation to establish the training criteria, the desired outcomes, and the long-term objectives.

The workshop provided an overview of the aims of the IEA’s Trends in International Mathematics and Science Study (TIMSS) and Progress in International Reading Literacy Study (PIRLS), their theoretical and measurement frameworks, and the most appropriate methods of analysis for the data. Participants were also trained in how to develop and implement self-designed analysis plans using the IEA’s own well-established software tool, the IDB Analyzer.

The training was conducted in three parts. The first part was a brief review of all the information released from previous TIMSS and PIRLS studies, namely the theoretical frameworks, international reports, encyclopedias, technical reports, user guides, and international databases. The second part was an in-depth introduction to the sampling and survey design of large-scale assessments, with a specific focus on TIMSS and PIRLS, and their implications for statistical data analysis. In the third part of the training, the participants were introduced to the practical analysis of TIMSS and PIRLS data, and guided step by step through the process. The IEA IDB Analyzer (provided free of charge by the course organizers) handles the design-related complexities of large-scale assessment data, such as sampling weights and plausible values. All examples and practical assignments were tailored to the Moroccan national context. Participants were encouraged to develop their own example research questions, use statistical analysis to answer these questions, and present the results to the other course members, who in turn were encouraged to provide feedback and discuss ideas for further research.
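
To give a flavor of what the IDB Analyzer takes care of automatically, the sketch below (in Python, with hypothetical column names) combines an achievement mean across the five plausible values using Rubin’s rules. It is a simplified illustration only: the IDB Analyzer additionally uses jackknife replicate weights to estimate the sampling variance, which this example approximates with a basic weighted formula.

  # Illustrative sketch with hypothetical column names; not the IDB Analyzer itself.
  import numpy as np
  import pandas as pd

  data = pd.read_csv("timss_students.csv")          # hypothetical data extract
  pv_cols = [f"math_pv{i}" for i in range(1, 6)]    # the five plausible values
  w = data["total_weight"]                          # final student weight

  # Weighted mean and a simple approximation of its sampling variance,
  # computed separately for each plausible value.
  means = np.array([np.average(data[pv], weights=w) for pv in pv_cols])
  variances = np.array([
      np.average((data[pv] - m) ** 2, weights=w) / len(data)
      for pv, m in zip(pv_cols, means)
  ])

  # Rubin's rules: combine the point estimates and the two variance components.
  estimate = means.mean()                 # combined achievement mean
  within = variances.mean()               # average sampling variance
  between = means.var(ddof=1)             # variance between plausible values
  total_se = np.sqrt(within + (1 + 1 / len(pv_cols)) * between)
  print(f"Mean: {estimate:.1f}, SE: {total_se:.1f}")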

The 10 participants from the National Authority of Evaluation at the Higher Council of Education, Training and Scientific Research in Rabat were inspired by the workshop, working individually when required and as a team when reviewing and providing feedback. All participants ably demonstrated that they had achieved the objectives of the hands-on training; after the workshop, they were all able to:

  • Comprehend the conceptual and theoretical underpinnings of IEA studies;
  • Understand the sampling design and data structure of IEA databases, and their implications for secondary analysis;
  • Recognize the necessity of applying specific statistical tools to analyze data from IEA studies;
  • Retrieve IEA databases, international reports, technical documentation and questionnaires from the IEA’s website;
  • Prepare IEA data for analysis using the software provided (the IDB Analyzer);
  • Develop a sound analysis plan, taking into consideration the specific requirements of the data, as well as their hypotheses; and
  • Implement basic cross-national analyses of IEA data using the IDB Analyzer, both interpreting and presenting the results.

At the end of the workshop, all participants were positive about the training course and showed great enthusiasm for using what they had learned to extend their educational research work. A follow-up training course was proposed to focus on other educational research topics and to consolidate the skills developed during this first course.


In Morocco, the training was provided by Yasin Afana, a researcher at the IEA Hamburg, currently also working on his PhD thesis in association with the University of Leicester (UK). Mr Afana can be contacted at yasin.afana@iea-dpc.de.

Our team of consultants can help you build research skills capacity among stakeholders in your organization or education system who want or need to participate in international comparative studies, or to develop national assessments focused on monitoring educational outcomes.

For further information about our training services, please consult our website.

AEA Europe 2016: Let’s talk about assessment!

By Sabine Meinck & David Rutkowski

Why go to Cyprus in November? Well, yes, the weather is good and the Mediterranean Sea is beautiful, but that wasn’t our only reason. In addition to the wonderful weather and food, we enjoyed inspiring talks and critical discussions about educational assessments and their social and political underpinnings at the 17th annual conference of the Association for Educational Assessment – Europe (AEA-Europe). During the conference we learned a great deal about new ways to approach methodological issues, as well as cultural and social differences when assessing achievement. Novel and interesting solutions for computer-based delivery systems were presented, and the conference provided a forum for us to initiate new collaborations and build synergies in various fields.

Obviously, we were also there to contribute to the conference. Here, the IEA and the Centre for Educational Measurement at the University of Oslo (CEMO) pooled our capabilities and personal skills to deliver a workshop on how to write a policy brief based on IEA data. We were enthused by the dedication of the workshop participants, especially as a sunny beach was waiting only a few meters away. It was a great pleasure to introduce IEA studies and their challenges, outline the features of a good policy brief, and discuss the challenges and opportunities that may arise from different national circumstances.

We dearly hope to soon see the briefs arising from this workshop!

 

Dr Sabine Meinck is Head of the Research, Analysis & Sampling Unit at the IEA, and Professor David Rutkowski is editor of the IEA’s policy brief series, a professor at the University of Oslo, and a researcher at CEMO. In February 2016, the IEA and CEMO signed an agreement to enhance collaboration between the two organizations and cooperate to promote international educational research.

For more information about IEA Policy Briefs, please go to the IEA website.

TIMSS and PISA results: Seeing past the headlines

Education research does not often make newspaper headlines. Even less often does it make headlines in multiple countries around the world. In just a few weeks’ time, we will see a rare exception.

In the space of a week, the latest set of results from both TIMSS (29th November) and PISA (6th December) – two international surveys that compare the performance of education systems around the world – will be published. The results are a closely guarded secret until then, but two things can be guaranteed: there will be newspaper headlines, and the headlines will be misleading.

I know there will be headlines, because I have just spent the week in Oslo at the General Assembly of the IEA – the organization responsible for TIMSS, together with the TIMSS & PIRLS International Study Center at Boston College. Not only have we had a preview of the results, but we have also spent much of the week discussing how to ensure the results are reported widely and accurately in the media, and are understood and applied by policymakers and the wider sector (my blog from the What Works Global Summit last month discusses exactly this topic).

And I know the headlines will be misleading. Most notably, they will focus on the rankings of countries and will fail to grasp the notion of statistical significance. A country’s ranking may change from year to year simply because the number of countries participating in the study has changed. Everyone knows the old joke about coming second in a beauty contest with only two contestants (rather a different achievement to coming second out of 50!).

And differences between countries, or changes over time, are not always indicative of fundamental differences or changes. In football they say the league table never lies. But at any given point in time, some teams will be punching above or below their weight (to mix metaphors!) simply because of an unusually good or bad run of form. Small differences are sometimes just the result of these blips.
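
To make the statistical point concrete, here is a tiny illustration with invented numbers (not actual TIMSS or PISA results): a four-point gap between two country means can easily be smaller than the uncertainty around it, so the ranking could flip by chance alone.

  # Hypothetical numbers for illustration only; not real TIMSS or PISA results.
  from math import sqrt

  mean_a, se_a = 512.0, 2.8   # country A: mean score and standard error
  mean_b, se_b = 508.0, 3.1   # country B: mean score and standard error

  diff = mean_a - mean_b
  se_diff = sqrt(se_a**2 + se_b**2)   # assumes the two samples are independent
  z = diff / se_diff

  print(f"Difference: {diff:.1f} points, SE: {se_diff:.1f}, z = {z:.2f}")
  # Here |z| is well below 1.96, so the 4-point gap is not statistically
  # significant at the 5% level; the two countries are effectively tied.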

NFER has been involved in the international studies for the past twenty years, so we’re familiar with the good, the bad and the ugly of how they’re reported. Over the coming weeks we will play our part in helping to explain the results and debunk the myths.

Despite the challenges, attending IEA’s General Assembly has emphasized two ways in which the value of international studies will be realized long after the initial flurry of headlines.

Firstly, the data generated goes far beyond measuring overall performance. Student and school questionnaires provide a rich source of information on students’ attitudes and experiences of school, on teachers and their professional development, and on school organization. The TIMSS encyclopedia (published in October 2016) also contains a wide range of comparative data on curricula, teacher training, school starting age, and more.

By comparing countries and tracking changes over time, key education issues can be explored in ways that are not possible with any other data source. Indeed, NFER is currently undertaking a project using international data to provide an important new perspective on social mobility in England.

Secondly, the international studies promote cooperation and learning between educators around the world. You will rarely find such a coming together of people from education ministries and research institutions from such a diverse set of countries. The conversations and relationships formed through networks such as the IEA enable genuine dialogue and learning to take place, not just over-simplified policy borrowing from the top-performing nations.

IEA’s strapline is ‘Researching education, improving learning’, which aligns well with NFER’s own ‘Evidence for Excellence in Education’. Those are two headlines I’d be happy to see appearing later this year.

Ben Durbin is Head of International Education at NFER. He can be contacted at b.durbin@nfer.ac.uk

ICME 2016 – A lasting impression


By Sabine Meinck

From 24 to 31 July 2016, the University of Hamburg hosted the biggest conference worldwide on the didactics of mathematics: the 13th International Congress on Mathematical Education (ICME), and we had the pleasure of participating! I just love the inspiring atmosphere of international research conferences, and this one certainly left a lasting impression. The conference, held once every four years, attracted more than 3500 participants.

My colleagues Mr Oliver Neuschmidt and Ms Milena Taneva and I were invited to deliver two workshops. We introduced IEA TIMSS and IEA TEDS-M, illustrating how the data from these studies can be used for research relating to the didactics of mathematics. Workshop participants were enthused about the opportunities this data presents for their research interests.

ICME-13 was extraordinarily well structured. With a good variety of keynote speeches, lectures, discussion groups, topic study groups, and workshops, the event succeeded in bringing together participants from all over the world with similar research interests, encouraging critical dialogue and collaboration initiatives. My personal highlights of the conference were: the presentation by Prof. Dr Kristina Reiss (Technical University of Munich), promoting mixed methods using large-scale assessment data; the keynote speech by Dr Vijay Reddy (National Research Coordinator of IEA TIMSS in South Africa), describing the value of TIMSS for enhancing South Africa’s national education system; the lecture by Prof. Dr Sigrid Blömeke (CEMO) on the extended use of TEDS-M; and, last but not least, the reception in our beautiful town hall!

In my view, the conference represented a perfect platform for familiarizing international mathematics researchers with the IEA’s mathematics-related studies. I’m already considering attending the next ICME in 2020, although sadly that one will not be in my hometown.

Dr Sabine Meinck is Head of the Research, Analysis & Sampling Unit at the IEA.

Exchanging knowledge in Abidjan

What I learned at the 2nd African Ministerial Forum on ICT Integration in Education and Training

On 7 June 2016, I flew to Abidjan in Côte d’Ivoire to attend the 2nd African Ministerial Forum on ICT Integration in Education and Training. There, I was privileged to present results from the IEA’s first International Computer and Information Literacy Study (ICILS 2013) to more than 150 representatives from 20 African countries.

The Forum, a regional policy dialogue initiative, exists to assist African states in better exploiting the potential of information and communications technology (ICT) to achieve their goals of inclusive and equitable quality education and lifelong learning.

Presentations and discussions at the forum clearly revealed an urgent need to improve and transform Africa’s education and training systems through relevant and effective integration of ICT. Delegates discussed how to use ICT to integrate marginalized populations, and ultimately prepare learners for future employment and the technological demands of the 21st century.

Efforts are being made in several areas across all countries, including integrating ICT into education and training policies and strategies, focusing on leadership strategies that help policymakers create a vision for their education systems, and improving school and classroom contexts.

My impression was that more emphasis should be placed on measuring the outcomes of the implemented programs and policies, not only in terms of building ICT competencies, but also in identifying how using and teaching ICT can help improve student performance in the future workplace and in broader society.

When sharing the results of ICILS 2013, I found that many of the findings of the study were confirmed by experiences in the African context. Here, I would like to stress two key conclusions from ICILS 2013:

  • Providing technical infrastructure is a major challenge in the region, but it is worth remembering that the availability of ICT equipment is, in itself, no guarantee of success.
  • The role of teachers is fundamental to this transformation; they need to be properly trained in the use of new technologies to, in turn, help students become ICT literate.

I invited all African countries to join our next cycle, ICILS 2018. Interested readers can obtain more details at http://www.iea.nl/icils_2018.html, and should note that the field trial will commence in 2017.

Michael Jung

 

The IEA welcomes contributions from all organizations interested in supporting this critical international research endeavor. Organizations and companies interested in exploring options for partnership are invited to contact the IEA to discuss mutual opportunities for collaboration.