How did Latin America do in the Pisa tests?

The OECD administers this test every three years to measure reading, science, and math skills. Here we tell you how Latin America did.

Student taking an exam. / Photo: Rawpixel - Reference Image

LatinAmerican Post | Marcela Peñaloza

Read in Spanish: ¿Cómo le fue a Latinoamérica en las pruebas Pisa?

Each country measures the quality of its education through tests that evaluate its students and its education system. The OECD (Organisation for Economic Co-operation and Development) has its own worldwide version: the Programme for International Student Assessment (PISA). The results of the 2018 edition were announced on December 2. Here we tell you how Latin America did.

Reading, science, and math: students are below average

Unfortunately, no country in the region made the top 10 of the ranking. In fact, the first to appear is Chile, in 43rd place. In all three subjects evaluated, the Latin American countries scored below the average of 480 points.

In reading, the results show that all the countries in the region, listed below, are below the average of the OECD countries. According to the test, the region has a serious reading comprehension problem.

According to the BBC, the OECD explains that "one in four students from the 36 member countries of the organization cannot complete the most basic reading tasks." The situation would be even more worrying in developing countries, where quality levels are lower.

43. Chile

48. Uruguay

49. Costa Rica

53. Mexico

57. Brazil

58. Colombia

63. Argentina

64. Peru

71. Panama

76. Dominican Republic

Also read: Why women select college majors with lower earnings potential

In science, the positions change little. Chile tops the regional list again, followed by Uruguay.

45. Chile

54. Uruguay

57. Mexico

60. Costa Rica

62. Colombia

64. Brazil

65. Argentina

66. Peru

76. Panama

78. Dominican Republic

Finally, in mathematics, Uruguay surpassed Chile, and the countries ranked as follows:

58. Uruguay

59. Chile

61. Mexico

63. Costa Rica

64. Peru

69. Colombia

70. Brazil

71. Argentina

76. Panama

78. Dominican Republic

Read also: Young children can learn math skills from intelligent virtual characters

The results in the three categories evaluated point to a major flaw in the educational systems of Latin America. The OECD emphasizes that poor reading comprehension limits students' opportunities: when they do not understand what is being asked of them, or the written instructions they receive, they cannot carry out those tasks successfully.

Which countries did better in the Pisa tests?

On this occasion, China took first place in all three subjects evaluated. Singapore came second, also in all three, and Macao took third in reading, science, and mathematics.

From fourth to tenth place, the countries vary by subject. Among them are Hong Kong, Estonia, Finland, Canada, Ireland, South Korea, Taiwan, Japan, and the Netherlands.

Criticisms of the Pisa tests

Although the test gives an idea of the state of the educational systems it measures, educators have pointed out that it has flaws that may bias the results and that it leaves aside other competencies, such as artistic ones.

The BBC explains that the test has four deficiencies that may encourage countries to copy successful educational models without taking into account their own socio-cultural context.

Also read: Quitting Facebook could boost exam results

The first deficiency is that the questions are not worded in a way that lets students give a satisfactory answer demonstrating their knowledge of the subject. The second is a cultural bias, still not eliminated, that prevents the results from being measured fairly. For example, in 2012 it was identified that mathematical concepts were understood differently depending on the region where the question was asked.

Third, there are frequent changes in the questionnaires and in how the results are tracked, which creates contradictions. The BBC explains that "changes in the statements and definitions of the indices are frequent, without explaining the cause." These changes raise doubts about what Pisa measures and how it is quantified. Finally, Pisa has been criticized for not knowing how to respond to unexpected results. According to the same outlet, when results were not what was expected, Pisa responded with superficial explanations lacking well-founded arguments.