Tag: ONL252

Some notes on literacy

We often discuss online participation in terms of digital literacy. By focusing on “literacy”, online participation becomes an individual problem: if you are unable to participate in an online context, you are seen as having some kind of problem with your literacy. You may not be sufficiently “native” and fluent. You may not understand how to use the tools or the content available to you, struggling even in “visitor mode”. By the way, I think we constantly shift between visitor and resident modes, so I don’t like those terms being placed at the two ends of a continuum. That relation is messier than anything that lets itself be put on a line. I think.

If we focus on digital literacy, we also have to go back to the original meaning of “literacy”: the ability to read and write text. First, you need literacy in order to have digital literacy (or problem-solving skills in technology-rich environments, as the OECD puts it).

Poorly designed tools require high literacy

But the strong focus on literacy risks missing the question of how we design the tools you need to master in order to be digitally literate. If these tools are poorly designed, if they are illogical, complicated and perhaps even impossible for some people to use (as is often the case for blind people or people with intellectual disabilities, to mention just a few), then they place much higher demands on literacy than if they were well designed, simple to use and accessible. Focusing on the flaws in the design of our tools also shifts the responsibility from the individual (“if you can’t do this, you need to increase your literacy”) to the system.

By focusing on design, we shift the responsibility for problems related to digital participation from the individual to the surrounding society. Why do we provide tools that are poorly designed and exclude some people? If we designed better tools, maybe people with lower literacy levels could also participate. How many made a fast exit from the ONL course because they struggled to understand the course web page? About half of the enrolled students in one of our online courses at Lund University don’t make it through the first three weeks of the course. There are several reasons for that (the most common is that they got a place on another course), but how many left because of Moodle? We don’t know. I sort of want to kill myself every time I have to go into the admin and editing mode…

With high demands on literacy there is a risk of excluding some learners from learning. With easier tools we could lower the “literacy bar”. It would be interesting to figure out how large a proportion of a population can be regarded as having a level of literacy high enough that they can presumably take on online learning with ease, even when the tools are poorly designed.

State of the art literacy in the OECD

The OECD study PIAAC measures both problem-solving and basic computer literacy skills through its module Problem Solving in Technology-Rich Environments, conducted in a digital environment where various performance tasks measure participants’ use of ICT applications. PIAAC targets the whole adult population; a separate survey, PISA, targets younger students.

The results from Sweden show that although the country is presented as a top performer, almost 70 percent of the population scores relatively low on adaptive problem solving (which is where digital literacy comes in).

A diagram showing the situation on literacy in Sweden


Figure 1: OECD (2024), Do Adults Have the Skills They Need to Thrive in a Changing World?: Survey of Adult Skills 2023, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/b263dc5d-en.

Compared with some other countries and the average OECD result, we can assume that the share of the population scoring at Level 2 or below is higher in many countries, but probably a bit lower in Finland and Japan.

Diagram. OECD on literacy levels in several countries


Figure 2: OECD (2024), Do Adults Have the Skills They Need to Thrive in a Changing World?: Survey of Adult Skills 2023, OECD Skills Studies, OECD Publishing, Paris, https://doi.org/10.1787/b263dc5d-en.

The adaptive problem-solving scale has five levels: Below Level 1 and Levels 1–4, with scores ranging from 0 to 500 points.

Adaptive problem solving is defined as the ability to achieve one’s goals in a dynamic situation in which a method for solution is not immediately available, requiring engaging in cognitive and metacognitive processes to define the problem, search for information, and apply a solution in a variety of information environments and contexts. (Source: https://www.oecd.org/en/topics/digital-skills.html)

The assessment involves three overarching cognitive processes:

Definition: This involves selecting, organizing and integrating problem information into a mental model; retrieving relevant background information; and the ability to externalize the problem’s main features, with metacognitive processes including goal setting and monitoring problem comprehension.

Searching: This involves searching for operators in the environment (locating information about available actions that might solve the problem) and evaluating how well operators satisfy the problem constraints. (Source: https://www.oecd.org/en/topics/digital-skills.html)

Application: This is when the problem solver applies plans to solve a problem and executes the specified operators, with metacognitive processes involving monitoring progress, taking action if the problem changes or progress has stalled, and reflection.

I would argue that the platforms and tools we offer our learners often require proficiency at the upper end of Level 3, or at Level 4, in both literacy and problem solving. By setting the bar this high, we actually “design away” many potential learners who would have had the intellectual capacity to learn but cannot cope with our tools.