ISSN: 2013-2255

Lawson’s Psychological Critical Thinking Exam as an instrument to improve critical thinking

Marta-Natalia Torres-Domínguez a https://orcid.org/0000-0002-1008-2965

Itxaso Barberia a https://orcid.org/0000-0003-3045-2289

Javier Rodríguez-Ferreiro a, b https://orcid.org/0000-0001-9828-8302

Universitat de Barcelona, Spain.

 

a Departament de Cognició, Desenvolupament i Psicologia de l’Educació.

b Contact author: Passeig de la Vall d’Hebron 171, 08035 Barcelona, Spain. rodriguezferreiro@ub.edu

 

Research article. Received: 08/07/2024. Revised: 05/11/2024. Accepted: 18/11/2024. Published: 02/01/2025.

Abstract

INTRODUCTION. Critical thinking is vital in both science and daily life, being crucial for identifying anecdotal evidence, unfalsifiable hypotheses, biased data sampling, and unreliable causal connections. This study aimed to adapt Lawson’s revised version of the Psychological Critical Thinking Exam (PCTE) into Spanish, and to use it to assess improvement in Psychology students taking a module on reasoning.

METHOD. Eighty-seven third-year Psychology students responded to the odd items of the original PCTE at the beginning of the term, and to the even items at the end. During the module, one session was dedicated to discussing Lawson’s critical thinking questions, evaluating the reliability of purportedly scientific statements.

RESULTS. Students’ critical thinking skills improved both in problem detection (i.e., volunteers who initially failed to identify methodological problems later did so) and in accurately detecting the target problem (i.e., issues related to the critical thinking question addressed).

DISCUSSION. We consider the PCTE a valuable tool for assessing and developing critical thinking among university students. Moreover, we suggest that this study itself could serve as didactic material for analysis by students, enhancing their critical thinking skills.

Keywords

critical thinking, PCTE, Spanish adaptation, Lawson.

Recommended reference

Torres-Domínguez, M. N., Barberia, I., & Rodríguez-Ferreiro, J. (2025). Lawson’s Psychological Critical Thinking Exam as an instrument to improve critical thinking. REIRE Revista d’Innovació i Recerca en Educació, 18(1), 1-10. https://doi.org/10.1344/reire.47238

 

© 2025 The authors. This is an open access article distributed under the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. To view a copy of this license, visit

https://creativecommons.org/licenses/by/4.0/

 


 

1 Introduction

Society is going through a series of events that should make us rethink our current lifestyle, including the COVID-19 pandemic and the imminent threat of climate change. In this context, education is crucial for raising awareness of the need to adopt a more sustainable model of society, involving, for example, production, food consumption, and transport models that would significantly reduce our carbon footprint. Such measures could also help reduce the risk of future zoonotic pandemics, for example, by shifting consumption towards non-animal proteins and/or promoting local commerce. In achieving these goals, the importance of enhancing critical thinking has been highlighted (e.g., Minott et al., 2019; Straková & Cimermanová, 2018; Taimur & Sattar, 2019). Improving critical thinking skills could help society recognize the complex connections between high meat consumption, environmental degradation, and public health risks, including the potential for pandemics. By critically evaluating evidence on issues like factory farming, zoonotic disease transmission, and ecological impacts, people should be able to make more informed choices that may lead to reduced meat consumption and, ultimately, help prevent conditions that facilitate new pandemics.

Critical thinking can be considered as a construct that encompasses not only scientific reasoning (i.e., the ability to generate hypotheses, detect and test evidence, as well as generate theories; Koerber et al., 2015) but also many cognitive processes and dispositions involved in information interpretation (Holyoak & Morrison, 2005). An early definition of this construct proposed by Facione (1990, p. 3) stated that critical thinking is a “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based”. Following Facione, critical thinking includes skills such as interpretation, analysis, evaluation, inference, explanation, and self-regulation, as well as dispositions such as inquisitiveness, open-mindedness, and fair-mindedness.

Critical thinking is a transversal construct that plays a crucial role in different fields of knowledge, such as hard sciences, humanities and social sciences, as well as in solving real-world problems (Dowd et al., 2018; Halpern & Dunn, 2021). This skill allows individuals to recognize unreliable claims based on anecdotal evidence and weak methodological procedures, such as biased data collection, and inference of causality from correlation (Lawson et al., 2015). It is especially relevant for behavioral science, which often has to confront widely held erroneous beliefs, grouped under what has come to be known as “popular psychology”. This term refers to a set of theories, concepts, and practices related to human behavior and mental processes that have become widely known and accepted among the general population. These ideas may originate from scientific research, but they are often simplified, distorted, and presented in a way that is easily digestible by the public (Lilienfeld et al., 2010). While some of these popular psychology theories are supported by empirical evidence, others are not. Unfortunately, there are many pseudoscientific approaches to behavioral analysis that are often presented as legitimate by those who do not have formal training or expertise in the field. These approaches are characterized by a lack of empirical evidence and a failure to adhere to scientific principles, such as falsifiability, replicability, and objectivity (Lilienfeld et al., 2012). Examples of such pseudoscientific approaches present in popular psychology include positive psychology, the use of hypnosis to recall hidden memories of past traumas, personality evaluation through graphology, interpretation of dreams, the use of the polygraph as a reliable lie detector (e.g., Lilienfeld et al., 2010), neuro-linguistic programming (e.g., Witkowski, 2010), and control of behavior through subliminal messages (e.g., Majima, 2015).

It is important to be aware of the limitations of popular psychology and to approach any claims with a critical and skeptical mindset. Lawson (1999, p. 207) defined psychological critical thinking as the ability to

evaluate claims in a way that explicitly incorporates basic principles of psychological science. [...] For example, […] to judge claims based on a lack of empirical evidence, testimonial or anecdotal evidence, unfalsifiable theories, biased samples, or simple correlational data as weak claims.

Both Facione's and Lawson's definitions highlight the importance of providing people with tools that enable them to distinguish between valid information and biased information, thus contributing to the aforementioned goal of educating new generations to guide society towards more efficient and responsible models.

Although critical thinking is often mentioned as a key competency in education (European Commission, 2019; OECD, 2018), it is rarely addressed explicitly in the academic curriculum and few instruments have been developed to measure it directly, particularly in the field of psychology. To our knowledge, the first tool to directly assess critical thinking in psychology students, the Psychological Critical Thinking Exam (PCTE), was designed by Lawson (1999). This tool was subsequently revised to improve its validity and reliability, and to make it more useful for identifying the specific areas of critical thinking in which students have the most deficiencies (Lawson et al., 2015). The PCTE includes items that describe a specific experimental situation from which a conclusion is drawn. The conclusions of each item conflict with one of the seven principles governing critical thinking taken into consideration by Lawson and his collaborators: a) the events or their relationship may occur by chance; b) a control group is needed to compare the performance of the experimental group; c) correlation does not necessarily imply causality; d) the sample must be representative in order to be able to generalize the results; e) the instructions and questions received by participants should avoid biasing their responses; f) theories and hypotheses must be falsifiable; g) it is highly improbable that a single event is the exclusive cause of a complex phenomenon.

During the test, students must identify and explain the problem associated with each item. Then, their responses are quantified to determine their level of critical thinking. Lawson et al. (2015) concluded that the PCTE was a valid and reliable instrument for measuring this construct. Indeed, the PCTE has been recommended as an evaluation tool to assess critical thinking in students studying for a degree in Psychology (American Psychological Association [APA], 2016; Bensley & Murtagh, 2012).

The aim of the present study was to adapt the PCTE into Spanish, and to go beyond its use as an instrument to assess the critical thinking of undergraduate psychology students by exploring its use as a teaching tool aimed at improving critical thinking. Our hypothesis was that use of the tool would enhance critical thinking, which would be reflected in an increase in the scores recorded in the test performed at the end of the term, compared to those obtained at the beginning.

2 Method

2.1 Participants

A total of 87 students from the Universitat de Barcelona participated in this educational intervention. Seventy-seven were women and 10 were men, with ages ranging from 20 to 70 (mean = 22.77, SD = 5.91). The intervention protocols were revised and approved by the ethics committee of the university (Institutional Review Board IRB00003099, Comissió de Bioètica de la Universitat de Barcelona). The intervention took place in regular classes taught as part of the Psychology degree at three points during the semester. All volunteers provided written informed consent for their responses to be used for research purposes.

2.2 Materials

Adapting Lawson’s revised version of the Psychological Critical Thinking Exam (PCTE, Lawson et al., 2015) into Spanish involved a multi-step process to ensure linguistic accuracy and cultural relevance. Initially, we translated the PCTE into Spanish through common translation and back-translation procedures. First, a native Spanish speaker translated the original version into Spanish, then a bilingual native English professional translator back-translated the Spanish version to English. Finally, the small inconsistencies observed between the original and back-translated versions were discussed and resolved by consensus. Subsequently, we took into account the fact that some terms or examples may not translate directly or may lack relevance in Spanish-speaking cultures. We reviewed the test items and adjusted or replaced culturally specific references with examples that are meaningful and/or more appropriate in current Spanish-speaking contexts. For example, the original item five states that “Years ago, some psychologists observed that the parents of autistic children appeared more aloof and detached from their autistic children than were parents of normal children” (Lawson et al., 2015, p. 5). We decided to translate this sentence as “Hace unos años, unos psicólogos observaron que los padres de niños con autismo parecían más desapegados de sus hijos que los padres de niños sin dicho trastorno”, as we did not consider qualifying children with autism as “not normal” to be appropriate. In fact, this is the only significant change; the rest of the translation remains largely literal, as the topics covered are, in our view, already well-suited to contemporary Spanish society.

The PCTE is a 14-item questionnaire developed to assess psychological critical thinking. Each item consists of a statement related to psychological phenomena violating one of the basic principles of psychological critical thinking (two items for each principle). Table 1 displays the questions one must ask oneself in order to realize that certain basic critical thinking principles are not being respected in the conclusions of each PCTE item. For example, consider the item

A researcher tested a new drug designed to decrease depression. She gave it to 100 clinically depressed patients and discovered that their average level of depression, as measured by a standardized depression inventory, declined after 4 months of taking the drug. She concluded that the drug reduces depression. (Lawson et al., 2015, p. 4)

This item would be in breach of the basic principle reflected in the second question, since it describes a situation in which there was no control group against which to compare the improvement.

Table 1.

Psychological critical thinking questions described by Lawson et al. (2015).

Questions related to the basic principles of critical thinking

1. Could the event or relationship have occurred by chance (e.g., you just happened to have a car accident on the day that a psychic predicted your car would be damaged)?

2. Is there a control group or comparison group against which to assess the performance of the experimental group? We might see improvement in the experimental group, but would it have occurred anyway without any treatment or intervention (i.e., due to placebo effects, passage of time, regression toward the mean, etc.)?

3. Is the person concluding there is a causal relationship on the basis of correlational data?

4. Is the person trying to generalize the findings to a larger group based on a biased or unrepresentative sample?

5. Did the person ask questions of participants in a biased manner (e.g., leading questions, loaded or emotional wording, or confusing wording)?

6. Has the person made it impossible to falsify his or her theory or hypothesis? Does he or she consider positive evidence as support for the theory and negative evidence as not being relevant? Does he or she claim that the phenomenon disappears once you try to test it?

7. Is the person claiming to have found the cause of some behaviour or phenomenon? Most complex behaviours or phenomena have multiple causes.

 

The Spanish version of the PCTE (PCTE-Sp) and the translated questions related to the critical thinking basic principles are presented as supplementary material at https://tinyurl.com/yt3adn8s. The participants had to identify if there was any problem with the conclusions of each statement and, if any, explain it.

2.3 Procedure

The study was implemented in the context of the “Thinking and problem solving” module, a compulsory module in the third year of the Psychology degree. The module syllabus includes topics such as inductive and deductive thinking, moral reasoning, analogical thinking and creativity. At the beginning of term (i.e., pre-test), all students completed a questionnaire including, among other questions, the odd items of the PCTE-Sp: seven items, each of them corresponding to one of the critical thinking questions. At this point, the students received no explanation regarding the items or the purpose of the questionnaire. During the module, one session was specifically dedicated to scientific thinking and the demarcation problem. During this session, Lawson et al.’s questions were discussed as a way to assess the reliability of purportedly scientific statements. First, the teacher presented and briefly explained the seven questions. Then, the students were asked to score the responses obtained in the pre-test, which were distributed among them after being anonymized. To do this, the students, supervised by the teacher, first discussed in small groups which question was the most relevant for each of the items. This scoring process was conducted for educational purposes only, and was not taken into account for the actual scoring of the study. Finally, at the end of term (i.e., post-test), the students responded to the even items of the PCTE-Sp, each of which also referred to one of the seven critical thinking questions. Both questionnaires were administered through the online platform Qualtrics (http://www.qualtrics.com), and the items were presented in random order for each student. 
After completing the second questionnaire, it was explained that the items were part of the Psychological Critical Thinking Exam and the students were debriefed about the relationship between the questionnaires and the seven critical thinking questions, which had previously been explained during the session dedicated to scientific thinking. They were also allowed to correct and compare their own pre-test and post-test responses.

Following Lawson et al. (2015, p. 3), responses to the questionnaires were coded by two independent raters according to the following scale: “0, no problem identified; 1, a problem recognized but misidentified; 2, identified main problem, but also mentioned less relevant problems; and 3, identified only the main problem”. The minimal discrepancies between the two observers were resolved after discussion between them. Then, the percentage of responses for each dimension of the scale and for each participant was calculated, both in the pre-test and the post-test.
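The per-participant calculation described above can be sketched as follows. This is an illustrative sketch, not the authors' actual pipeline, and the ratings shown are hypothetical.

```python
# Sketch: given one participant's consensus ratings on the 0-3 scale for
# the seven items of a single administration, compute the percentage of
# responses falling in each dimension of the scale. Data are hypothetical.
from collections import Counter

SCALE_LABELS = {
    0: "no problem identified",
    1: "problem recognized but misidentified",
    2: "main problem plus less relevant problems",
    3: "only the main problem",
}

def dimension_percentages(ratings):
    """Percentage of one participant's responses per scale dimension."""
    counts = Counter(ratings)
    n = len(ratings)
    return {level: 100 * counts.get(level, 0) / n for level in SCALE_LABELS}

# Hypothetical pre-test ratings for one participant (seven odd items).
pre_test_percentages = dimension_percentages([0, 1, 3, 3, 0, 2, 3])
```

Averaging these per-participant percentages across the sample yields the group-level values plotted in Figure 1.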

3 Results

The dataset that supports our results is available at https://tinyurl.com/yt3adn8s. The split-half reliability of the PCTE-Sp was calculated using IBM SPSS Statistics (version 25.0.0.0). All remaining analyses were performed using JASP (version 0.16.4.0). We used JASP’s default Cauchy prior width, r = .707, to calculate Bayes Factors (BF10). Bayes Factors are provided as an intuitive interpretation of the robustness of the observed effects, values above 3 being an indicator of evidence favouring the alternative hypothesis.

Regarding the internal consistency of item scores, taking into account the responses to the 14 items of the original questionnaire, a Spearman-Brown test showed a split-half reliability of r = .83, which was comparable to that of .88 obtained by Lawson et al. (2015). Note, however, that McDonald’s ω analyses conducted separately for responses to the odd (pre-test) and even (post-test) items yielded considerably different values of .37 and .80, respectively. This disparity might reflect a reduction in the heterogeneity of the assessed construct between the two measures.

We assessed participants’ improvement between the first and second administrations of the PCTE-Sp. A paired Student’s t-test was initially planned but, since the Shapiro–Wilk test revealed that none of the mean percentage scores calculated for each dimension followed a normal distribution (all Ws > 0.69, all p values < .001), we performed its non-parametric analogue, the Wilcoxon signed-rank test. Figure 1 shows the percentage of responses for each dimension of the rating scale, on both pre-test and post-test administrations of the questionnaire. The results revealed that students improved in problem detection (i.e., students who had not detected any methodological problem in the pre-test did so in the post-test), Z = 1195.00, p = .008, rrb = 0.40, BF10 = 16.85. They also improved in the accuracy of detection of the target problem: they misidentified fewer problems, Z = 1753.50, p = .006, rrb = 0.38, BF10 = 3.94, and they identified only the main problem more frequently, Z = 535.00, p < .001, rrb = -0.58, BF10 = 878.08.
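As a sketch of this analysis, the Wilcoxon signed-rank test and the matched-pairs rank-biserial correlation can be computed with SciPy. The article ran these analyses in JASP, so this is an assumed equivalent, and the data below are hypothetical.

```python
# Sketch of the reported analysis: a Wilcoxon signed-rank test on paired
# pre/post percentages, plus the matched-pairs rank-biserial correlation
# as effect size. Assumed equivalent of the JASP analysis; data are
# hypothetical.
import numpy as np
from scipy import stats

def wilcoxon_with_rank_biserial(pre, post):
    diff = np.asarray(post, dtype=float) - np.asarray(pre, dtype=float)
    diff = diff[diff != 0]                # the test discards zero differences
    res = stats.wilcoxon(diff)
    ranks = stats.rankdata(np.abs(diff))  # ranks of absolute differences
    w_plus = ranks[diff > 0].sum()        # sum of ranks of positive differences
    rrb = 2 * w_plus / ranks.sum() - 1    # matched-pairs rank-biserial r
    return res.statistic, res.pvalue, rrb

# Hypothetical per-participant percentages for one scale dimension.
pre = [10, 20, 0, 30, 40, 10, 0, 20]
post = [30, 40, 10, 30, 60, 20, 10, 40]
w, p, rrb = wilcoxon_with_rank_biserial(pre, post)
```

The rank-biserial correlation ranges from -1 to 1 and equals the proportion of rank mass favouring improvement minus the proportion favouring decline, which makes it a natural companion effect size for the Wilcoxon test.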

Figure 1

Mean percentage of responses for each of the dimensions of the rating scale, on both pre-test and post-test administration of the PCTE-Sp. Error bars denote standard error of means. * p < .05, ** p < .01, *** p < .001

4 General discussion

During this study, we adapted the PCTE, originally designed by Lawson et al. (2015), into Spanish, and applied it to students taking a compulsory module in the third year of a Psychology degree. Using a test-retest design, we found a significant improvement in the students’ critical thinking in the test carried out at the end of the term (i.e., post-test). We observed how the proportion of times they detected the violation of one of the basic principles of science increased in relation to the questionnaire answered at the beginning of the module (i.e., pre-test). In addition, we found that they were more accurate in identifying the problem, reducing misidentifications and increasing the proportion of times they focused only on the target problem. This improvement in the results suggests that the test itself could be an effective tool for enhancing critical thinking abilities in undergraduate students. A key aspect in achieving this improvement might be to encourage the students to think critically about the limitations inherent in the studies presented in the test, and to provide a clear explanation of these limitations.

However, our study is not without limitations and we advise caution in interpreting these results. We discuss these potential limitations below by focusing on the seven critical thinking questions suggested by Lawson. In this sense, the study itself could be presented to students for analysis to serve as didactic material for the topic at hand.

  1. Could the event or relationship have occurred by chance?

     It is unlikely that critical thinking skills improve by chance. Nevertheless, extraneous variables might have influenced the observed results (see also the next question). For instance, it might be the case that the odd items, used in the pre-test, are easier than the even items, used in the post-test. A solution to this problem could be to counterbalance the presentation of item sets in the pre- and post-test among participants.

  2. Is there a control group or comparison group against which to assess the performance of the experimental group?

     The main limitation of the study is that it lacks a control group with which to compare the performance of the intervention group. Although critical thinking skills are often considered difficult to acquire, it could have been the case that students improved for other reasons, e.g., having received other content in this or another module, or from exposure to extracurricular sources of knowledge.

     The solution to this problem could be to randomly assign students to an intervention group or to a control group in which the session on critical thinking questions is replaced by a session on an unrelated topic.

  3. Is the person concluding there is a causal relationship on the basis of correlational data?

     The study was not correlational, so this question does not directly apply. In any case, regarding a possible causal interpretation of the results of this study, the issue noted in the previous question calls for caution due to the absence of random assignment of participants to experimental and control groups.

  4. Is the person trying to generalize the findings to a larger group based on a biased or unrepresentative sample?

     The study was conducted with Psychology students and, hence, the sample was biased both in terms of age, with most of them being young individuals, and sex, since the large majority were women. Under these circumstances, it would be inappropriate to generalize the results to the general population. In any case, the nature of the research, aimed at the study of a tool for the assessment and development of psychological critical thinking, makes its results specifically relevant to the type of population studied.

  5. Did the person ask questions of participants in a biased manner?

     The study used a well-established questionnaire, so the assessment tool can be considered to be unbiased. Nevertheless, using only half of the items in each assessment might have negatively affected its reliability.

  6. Has the person made it impossible to falsify his or her theory or hypothesis?

     The hypothesis that students would present better critical thinking skills at the post-test compared to the pre-test is falsifiable, as it can be contradicted by a lack of differences between scores obtained before and after the intervention.

  7. Is the person claiming to have found the cause of some behaviour or phenomenon?

     The study did not aim to discover new underlying causes of behaviour.

References

American Psychological Association. (2016). Guidelines for the undergraduate psychology major: Version 2.0. American Psychologist, 71(2), 102–111. https://doi.org/gd7fp9

Bensley, D. A., & Murtagh, M. P. (2012). Guidelines for a scientific approach to critical thinking assessment. Teaching of Psychology, 39, 5–16. https://doi.org/gd7fpk

Dowd, J. E., Thompson, R. J., Schiff, L. A., & Reynolds, J. A. (2018). Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE Life Sciences Education, 17(1), 1–10. https://doi.org/gjtctb

European Commission, Directorate-General for Education, Youth, Sport and Culture. (2019). Key competences for lifelong learning. Publications Office. https://tinyurl.com/yb3tz22y

Facione, P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction – The Delphi Report. The California Academic Press, 423(c), 1–19.  https://tinyurl.com/5yr9644z

Halpern, D. F., & Dunn, D. S. (2021). Critical thinking: A model of intelligence for solving real-world problems. Journal of Intelligence, 9(2), 22. https://doi.org/gkj8mh

Holyoak, K., & Morrison, R. G. (2005). Thinking and reasoning: A reader’s guide. In K.J. Holyoak & R.G. Morrison (Eds.), Cambridge Handbook of Thinking and Reasoning (pp.1–9). Cambridge University Press.  https://doi.org/nsqg

Koerber, S., Mayer, D., Osterhaus, C., Schwippert, K., & Sodian, B. (2015). The Development of Scientific Thinking in Elementary School: A Comprehensive Inventory. Child Development, 86(1), 327–336.  https://doi.org/ghgfbs

Lawson, T. J. (1999). Assessing psychological critical thinking as a learning outcome for psychology majors. Teaching of Psychology, 26(3), 207–209. https://doi.org/d28n8d

Lawson, T. J., Jordan-Fleming, M. K., & Bodle, J. H. (2015). Measuring psychological critical thinking: An update. Teaching of Psychology, 42(3), 248–253. https://doi.org/gd7fp2

Lilienfeld, S. O., Ammirati, R., & David, M. (2012). Distinguishing science from pseudoscience in school psychology: Science and scientific thinking as safeguards against human error. Journal of School Psychology, 50(1), 7–36. https://doi.org/dm76fm

Lilienfeld, S., Lynn, S., Ruscio, J., & Beyerstein, B. (2010). 50 great myths of popular psychology. Wiley-Blackwell Publishing.

Majima, Y. (2015). Belief in Pseudoscience, Cognitive Style and Science Literacy. Applied Cognitive Psychology, 29(4), 552–559. https://doi.org/f7j827

Minott, D., Ferguson, T., & Minott, G. (2019). Critical Thinking and Sustainable Development. In W. Leal Filho (Ed.), Encyclopedia of Sustainability in Higher Education (pp. 1–6). Springer, Cham. https://doi.org/nsqh

OECD. (2018). The Future of Education and Skills: Education 2030. OECD Education Working Papers, 1–23. https://tinyurl.com/yxkecrdu

Straková, Z., & Cimermanová, I. (2018). Critical Thinking Development—A Necessary Step in Higher Education Transformation towards Sustainability. Sustainability, 10(10), 3366. https://doi.org/gfm7jc

Taimur, S., & Sattar, H. (2019). Education for Sustainable Development and Critical Thinking Competency. Encyclopedia of the UN Sustainable Development Goals, 1–11. https://doi.org/nsqj

Witkowski, T. (2010). Thirty-Five Years of Research on Neuro-Linguistic Programming. NLP Research Data Base. State of the Art or Pseudoscientific Decoration? Polish Psychological Bulletin, 41(2).  https://doi.org/bqxrxt

Statements and declarations

The authors declare that they have no significant competing financial, professional, or personal interests that might have influenced the performance or presentation of the work described in this manuscript.

Competing interests and funding

This study was funded by grant PID2019-106102GB-I00 from Ministerio de Ciencia e Innovación/Agencia Estatal de Investigación, (MCIN/AEI/10.13039/501100011033) to JRF.

1 Effect size is given by the rank biserial correlation.