It is hard to find a more salient topic in the news these days than the widespread adoption of digital technologies. Such rapid adoption necessitates increased digital literacy and competency at all ages. To ascertain how well prepared middle school students were for the increasingly digital world they would encounter in high school and beyond, the International Association for the Evaluation of Educational Achievement (IEA) launched the International Computer and Information Literacy Study (ICILS).

ICILS is a digital assessment of computer and information literacy administered to 8th-grade students. It is the only international large-scale assessment solely dedicated to measuring computer literacy. The first assessment, conducted in 2013, focused on computer and information literacy (CIL) and measured how well students could use digital devices such as computers to collect, manage, create, and exchange information digitally.1

In 2018, IEA added an optional computational thinking (CT) assessment, which measured how well students could “develop algorithmic solutions” to real-world problems that could be operationalized with a computer.2 Think of an algorithmic solution as a step-by-step plan for instructing a computer: it begins with an input and yields a desired output. CT was added in response to ever-evolving digital technologies, and the new ways of working with them, with which students would need to be familiar.
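To make “algorithmic solution” concrete, here is a minimal sketch of the kind of step-by-step, input-to-output logic the CT assessment targets. It is not an actual ICILS item; the task and function name are invented for illustration:

```python
def water_state(celsius):
    """Return the state of water at a given temperature (input -> output)."""
    # Step 1: handle freezing temperatures first
    if celsius <= 0:
        return "frozen"
    # Step 2: anything below boiling is liquid
    elif celsius < 100:
        return "liquid"
    # Step 3: everything else is steam
    else:
        return "steam"

print(water_state(-5))   # frozen
print(water_state(25))   # liquid
print(water_state(150))  # steam
```

Each branch is an explicit, unambiguous rule; that explicitness is what distinguishes an algorithmic solution a computer can follow from an informal description of the same task.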

This study continues to respond to the changing digital environment, adding new items in response to changes in students’ technological, social, and educational environments, while keeping a core set of items to allow student results to be compared over time. 

In addition to the assessment components, ICILS includes a student survey with demographic and affective questions, along with surveys of teachers, principals, and information and communication technology coordinators. These responses provide rich context for understanding student proficiency in CIL and CT.

ICILS has been conducted every 5 years since 2013, with preparations underway for ICILS 2028.  A list of countries that participated in 2013, 2018, and 2023 is found at the National Center for Education Statistics website. 

The U.S. participated in ICILS in both 2018 and 2023; at the school and student levels, participation was completely voluntary. Note that the U.S. did not meet the international standards for sampling participation rates, so its results are reported separately from those of other participating countries in IEA’s international reports.

  • In 2018, the U.S. average CIL score was 519, higher than the ICILS CIL average (the average of the non-benchmarking education systems meeting international technical standards). The U.S. ranked 5th out of 14 participating education systems, a term that includes both sovereign countries and benchmarking regions within countries.
  • The U.S. average CT score was 498, not measurably different from the ICILS CT average. In CT, the U.S. ranked 5th out of 9 participating systems.  See the U.S. ICILS 2018 report for details. 
  • In 2023 the U.S. average CIL score was 482, not measurably different from the ICILS CIL average. The U.S. ranked 22nd out of 33 systems. 
  • The U.S. average CT score was 461, which was lower than the ICILS CT average; here the U.S. ranked 17th out of 23 systems.  See the U.S. ICILS 2023 report for details. 

Some systems, including the U.S., saw decreases in ICILS average CIL and CT scores in 2023 compared to 2018 when one might have expected an increase due to greater reliance on computer technology for online K-12 learning during the global pandemic. Table 1 shows the systems that participated in CIL in both 2018 and 2023. Table 2 shows the systems that also participated in CT in 2018 and 2023.

Table 1. Average scores and changes in average scores of 8th-grade students on the Computer and information literacy (CIL) scale, by education system: 2018 and 2023

| Education system | 2018 average score | s.e. | 2023 average score | s.e. | Change in average score (2023-2018) | s.e. |
|---|---|---|---|---|---|---|
| Korea, Republic of¹ | 542 | 3.1 | 540 | 2.5 | ◊ | † |
| Denmark¹,² | 553* | 2.0 | 518 | 2.7 | -35 | 4.4 |
| Portugal² | 516 | 2.6 | 510 | 3.0 | ◊ | † |
| Finland | 531* | 3.0 | 507 | 3.6 | -24 | 5.4 |
| Germany | 518!* | 2.9 | 502 | 3.5 | -16 | 5.4 |
| France | 499 | 2.3 | 498 | 2.7 | ◊ | † |
| Luxembourg | 482* | 0.8 | 494 | 2.0 | 12 | 3.6 |
| Italy | 461* | 2.8 | 491 | 2.6 | 30 | 4.7 |
| *North Rhine-Westphalia (Germany)²* | 515* | 2.6 | 485 | 4.1 | -30 | 5.7 |
| United States³ | 519* | 1.9 | 482 | 6.6 | -37 | 7.4 |
| Uruguay¹ | 450 | 4.3 | 447 | 3.6 | ◊ | † |
| Kazakhstan² | 395 | 5.4 | 407 | 3.1 | ◊ | † |

◊ 2023 average score is not measurably different from the 2018 average score.

† Not applicable.

! Interpret data with caution. Estimate is unstable because the standard error is between 30 and 50 percent of the estimate.

* 2018 average score is significantly different (p < .05) from the 2023 average score.

¹ In 2023, met guidelines for sample participation rates only after replacement schools were included.

² In 2023, national defined population covers 90 to 95 percent of national target population.

³ In 2023, did not meet the guidelines for a sample participation rate of 85 percent.

NOTE: Education systems are ordered by 2023 average score. Benchmarking participants are indicated with italics. Differences were computed using unrounded numbers. In 2018, selected education systems had coverage, sampling, or reliability issues. Denmark met guidelines for sample participation rates only after replacement schools were included. Denmark, Kazakhstan, and Portugal had a national defined population that covered 90 to 95 percent of the national target population. Portugal nearly met guidelines for a sample participation rate of 85 percent after replacement schools were included. Italy collected data at the beginning of the year. The United States did not meet the guidelines for a sample participation rate of 85 percent.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018 and 2023. Modified reproduction of Table 5. https://nces.ed.gov/surveys/icils/icils2023/international.asp?tabontop.
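For readers who want to check a change score themselves, the significance behind the * markers can be approximated from the values in Table 1: divide the change by its standard error and compare the result to the 1.96 critical value for p < .05. This is a simplified sketch, not IEA’s exact procedure (which works from unrounded numbers):

```python
def change_is_significant(change, se_change, critical=1.96):
    """Approximate two-sided z-test: |change / s.e.| > 1.96 implies p < .05."""
    return abs(change / se_change) > critical

# U.S. CIL (Table 1): change of -37 with a standard error of 7.4
print(change_is_significant(-37, 7.4))  # True  -> flagged * in the table

# Portugal CIL: 510 - 516 = -6; s.e. of the difference roughly
# sqrt(2.6**2 + 3.0**2), about 4.0 (assuming independent samples)
print(change_is_significant(-6, 4.0))   # False -> not measurably different
```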

Table 2. Average scores and changes in average scores of 8th-grade students on the Computational thinking (CT) scale, by education system: 2018 and 2023

| Education system | 2018 average score | s.e. | 2023 average score | s.e. | Change in average score (2023-2018) | s.e. |
|---|---|---|---|---|---|---|
| Korea, Republic of¹ | 536 | 4.4 | 537 | 3.3 | ◊ | † |
| Denmark¹,² | 527* | 2.3 | 504 | 3.5 | -23 | 4.9 |
| Finland | 508 | 3.4 | 502 | 5.2 | ◊ | † |
| France | 501 | 2.4 | 499 | 3.9 | ◊ | † |
| Portugal² | 482 | 2.5 | 484 | 4.0 | ◊ | † |
| Germany | 486 | 3.6 | 479 | 3.8 | ◊ | † |
| Luxembourg | 460* | 0.9 | 476 | 2.5 | 16 | 3.7 |
| United States³ | 498* | 2.5 | 461 | 7.1 | -37 | 7.9 |
| *North Rhine-Westphalia (Germany)²* | 485* | 3.0 | 461 | 4.1 | -25 | 5.7 |

◊ 2023 average score is not measurably different from the 2018 average score.

† Not applicable.

* 2018 average score is significantly different (p < .05) from the 2023 average score.

¹ In 2023, met guidelines for sample participation rates only after replacement schools were included.

² In 2023, national defined population covers 90 to 95 percent of national target population.

³ In 2023, did not meet the guidelines for a sample participation rate of 85 percent.

NOTE: Education systems are ordered by 2023 average score. Benchmarking participants are indicated with italics. Differences were computed using unrounded numbers. In 2018, selected education systems had coverage, sampling, or reliability issues. Denmark met guidelines for sample participation rates only after replacement schools were included. Denmark and Portugal had a national defined population that covered 90 to 95 percent of the national target population. Portugal nearly met guidelines for a sample participation rate of 85 percent after replacement schools were included. The United States did not meet the guidelines for a sample participation rate of 85 percent.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018 and 2023. Modified reproduction of Table 6. https://nces.ed.gov/surveys/icils/icils2023/international.asp?tabontop.

Some findings from the study were unexpected, and researchers from participating countries were asked what surprised them about the results. One ICILS researcher observed that although students were born into an era of digital devices, their ICILS results suggest they are actually not able to do as much with these devices as expected.3 IEA Executive Director Dirk Hastedt summarized the findings during the 2023 international report release: “It is striking that, in a time of increased exposure to technology and digital information, students in lower-secondary school are actually demonstrating a decreasing ability to use computers in a way that is essential for effective and safe participation in society.”4 The solution to developing more advanced digital literacy skills among students may be “more explicit teaching of digital skills…to encourage development at a rate to match the growing digitalization of society.”5


The U.S. recognizes the value of advancing digital literacy. In 2018 and 2024, the U.S. government developed strategic plans to strengthen the foundations for STEM literacy, which included CT.6,7 The 2024 plan explained the need for students to build additional skills, stating that:

Educational systems are required to rapidly evolve to keep pace with technological changes, but teachers may not be prepared to teach the scientific reasoning and/or digital skills their students need to safely navigate these changes. This issue is increasingly apparent with the rapid developments in AI. Without explicit efforts to develop and refine the foundational skills around data and computational literacy that ultimately undergird cyber and AI literacy, educators and learners will continue to be underprepared to navigate the ways AI and other emerging technologies will change the landscape across all sectors of society (National Science and Technology Council, pp. 19-20).

In early 2025 the U.S. further focused on integrating artificial intelligence (AI) technologies into K-12 education.

To ensure the United States remains a global leader in this technological revolution, we must provide our Nation’s youth with opportunities to cultivate the skills and understanding necessary to use and create the next generation of AI technology.  By fostering AI competency, we will equip our students with the foundational knowledge and skills necessary to adapt to and thrive in an increasingly digital society.  Early learning and exposure to AI concepts not only demystifies this powerful technology but also sparks curiosity and creativity, preparing students to become active and responsible participants in the workforce of the future and nurturing the next generation of American AI innovators to propel our Nation to new heights of scientific and economic achievement.8

ICILS-participating countries in Asia, the Americas, the Middle East, and Europe also recognize the value of building digital literacy and view ICILS results as a significant indicator.  IEA staff report that the European Union is committed to supporting the ICILS study.  ICILS 2023 results were used as performance targets by the European Council and EU Member States.9

While the U.S. is not currently participating in ICILS 2028, 35 other education systems have so far expressed interest in the upcoming study. IEA reports that ICILS is in its item development phase, which will include new AI literacy items, and that it is not too late for countries and subnational education systems (e.g., cities or states) to join and participate.

The world has learned, and will continue to learn, much about student readiness for complex technology in an increasingly digitized world. One of the useful features of studies like ICILS is that data from many countries are available to the global research community. At a time when some of the data collected by the U.S. may not be published, including the U.S. ICILS 2023 datasets, those interested in ICILS proficiency measures can find datasets, technical documentation, and software at the IEA data repository.

Ultimately, countries want a well-educated and highly skilled workforce and citizenry to conduct the business of nation building and to move their inhabitants forward in terms of peace, prosperity, equity, and longevity. Historically, high-tech jobs have been associated with higher wages,10 and this association is expected to continue even as the specific technologies change, although recent trends show wages beginning to level off due to global competition. With ICILS’ measures of computer and information literacy and computational thinking, students are challenged to demonstrate their proficiency in using emerging technologies for schoolwork, out-of-school activities, and work beyond the classroom. Given the U.S. government’s commitment to supporting AI education, it might also be useful to support the assessment of digital literacy proficiency, including AI literacy proficiency, through U.S. participation in ICILS 2028. If we don’t measure the progress made, how will we know whether our efforts to build stronger digital literacy are working?


  1. Computer and Information Literacy (CIL) is defined as “an individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society.” Fraillon et al. (2013). Preparing for Life in a Digital Age: The IEA International Computer and Information Literacy Study International Report, p. 17. https://link.springer.com/book/10.1007/978-3-319-14222-7 ↩︎
  2. Computational Thinking (CT) is defined as an individual’s ability to recognize aspects of real-world problems which are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer. Fraillon et al. (2019). Preparing for Life in a Digital Age: The IEA International Computer and Information Literacy Study 2018 International Report, p. 27. https://www.iea.nl/publications/study-reports/international-reports-iea-studies/preparing-life-digital-world ↩︎
  3. Based on transcript of YouTube video entitled ICILS 2018: What is surprising in the results? Jeppe Bundsgaard. https://www.youtube.com/watch?v=6–1nqCiXm4 ↩︎
  4. IEA (2024). IEA Releases Latest Results of the International Computer and Information Literacy Study, ICILS 2023. https://www.iea.nl/sites/default/files/2024-10/ICILS%202023-International-Press-Release.pdf ↩︎
  5. Ibid., p. 2. ↩︎
  6. White House Office of Science and Technology Policy (2018). Charting a Course for Success: America’s Strategy for STEM Education. https://www.energy.gov/articles/charting-course-success-americas-strategy-stem-education ↩︎
  7. National Science and Technology Council (2024). Federal Strategic Plan for Advancing STEM Education and Cultivating STEM Talent.  https://bidenwhitehouse.archives.gov/wp-content/uploads/2024/11/2024fedSTEMplan.pdf ↩︎
  8. White House Presidential Actions (2025). Advancing Artificial Intelligence Education for American Youth. https://www.whitehouse.gov/presidential-actions/2025/04/advancing-artificial-intelligence-education-for-american-youth/ ↩︎
  9. IEA. (n.d.). ICILS 2023. https://www.iea.nl/studies/iea/icils/2023#section-416 ↩︎
  10. Brian Roberts and Michael Wolf, “High-tech industries: an analysis of employment, wages, and output,” Beyond the Numbers: Employment & Unemployment, vol. 7, no. 7 (U.S. Bureau of Labor Statistics, May 2018). https://www.bls.gov/opub/btn/volume-7/high-tech-industries-an-analysis-of-employment-wages-and-output.htm ↩︎

