Call for proposals on systematic reviews in computer science education
I met Jeff Froyd at the MSU Workshop in Integrated Engineering Education, and he asked me to share this call for a special issue of IEEE Transactions on Education. The whole notion of a "systematic review" is pretty interesting, and relates to the Blog@CACM post I wrote recently. His call has detailed and interesting references at the bottom.
Request for Proposals
2015 Special Issue on Systematic Reviews
The IEEE Transactions on Education solicits proposals for a special issue of systematic reviews on education in electrical engineering, computer engineering, computer science, software engineering, and other fields within the scope of interest of IEEE to be published in 2015. The deadline for 2,000-word proposals is 9 September 2013. Proposals should be emailed as PDF documents to the Editor-in-Chief, Jeffrey E. Froyd, at firstname.lastname@example.org. Questions about proposals should be directed to the Editor-in-Chief, Jeffrey E. Froyd, at email@example.com.
Special Issue Timeline
- 9 September 2013: Interested interdisciplinary, global teams of authors should submit proposals for full papers.
- 14 October 2013: The editorial team for the special issue will review proposals and notify authors of the status of their submissions.
- 31 December 2014: For accepted proposals, the authors will be asked to prepare manuscripts that will go through the standard review process of the IEEE Transactions on Education in the Scholarship of Integration. Completed draft manuscripts will be due on 31 December 2014. Papers are expected to be between 8,000 and 10,000 words in length.
- Xxx – 31 December 2014: A plan (timeline, milestones, activities, etc.) will be developed collaboratively to support manuscript completion by 31 December 2014. Steps in the process of preparing a systematic review include (i) establishing the research questions, (ii) selecting the databases to be searched and the search strings, (iii) establishing inclusion/exclusion criteria, (iv) selecting articles to be studied, etc. Meetings, in-person or virtual, will be scheduled to provide support for systematic review methodologies; they are intended to help develop systematic review expertise across the teams and to improve the quality of the published systematic reviews.
- 2015: Manuscripts accepted for publication are expected to be published in 2015.
Proposals for systematic review manuscripts must provide the following sections:
- (i) Contact information and institutional affiliation of the lead author
- (ii) An initial list of the team members who will prepare the systematic review, indicating how these team members provide the requisite expertise and global representation. Given the requirements for systematic reviews, it is expected that a qualified, interdisciplinary team will include one or more individuals with expertise in library sciences, one or more individuals with expertise in synthesizing methodologies (qualitative, quantitative, mixed method, or combinations of the three), and one or more individuals with domain expertise in the proposed content area. Given the need to promote global community in the fields in which ToE publishes, it is expected that a qualified team will represent the diverse global regions that comprise the IEEE.
- (iii) Description of the proposed content area, why a systematic review of education in the proposed content area is timely, why a systematic review will enhance development of the field, and how future initiatives might build on the systematic review.
- (iv) Initial description of the proposed systematic review methodology. The project will provide support to promote development of systematic review methodology across all participating teams. However, demonstration of initial familiarity with systematic review methodology will strengthen a proposal.
Brief Overview of Systematic Review Methodology
Diverse fields are developing the systematic review, a study of primary (and other) studies designed to address a carefully crafted set of questions, as a research methodology in its own right. At the risk of considerable oversimplification, systematic review methodology rests on two basic ideas. First, interdisciplinary systematic review teams can run keyword searches over large databases of journals, conference proceedings, and grey literature, and then systematically evaluate the returned items against explicit criteria to identify the set of articles to be reviewed. This first idea provides a transparent, unbiased, replicable process for identifying relevant articles. Second, teams can apply synthesizing methodologies developed over the last 50 years to extract trends, patterns, themes, relationships, gaps, and more from the identified set of articles. Synthesizing methodologies draw on a wide variety of quantitative approaches (e.g., statistical meta-analysis, network meta-analysis), qualitative approaches (e.g., meta-ethnography, content analysis), mixed-method approaches, and combinations of the three. Systematic, transparent use of literature search and synthesizing methodologies can produce reviews of the literature that may be seminal contributions to the community that created that literature. Good introductions to systematic reviews can be found at:
- Texas A&M University Libraries, Research Guides, Systematic Reviews
- Oxford LibGuides, Subject resources. Information Skills. Research Guides. Systematic Reviews, http://ox.libguides.com/systematic‐reviews
- University of Toronto, Gerstein Science Information Centre, Systematic Reviews in the Sciences & Health Sciences, http://guides.library.utoronto.ca/systematicreviews
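The two basic ideas described above, transparent criteria-based screening of search results followed by a quantitative synthesizing methodology such as statistical meta-analysis, can be illustrated with a minimal sketch. This is only an illustration, not part of the call: the record fields, inclusion criteria, and effect-size data below are entirely hypothetical, and the pooling shown is a simple fixed-effect inverse-variance combination, one of the most basic meta-analytic techniques.

```python
# Illustrative sketch only; all records, criteria, and data are hypothetical.
# Stage 1: screen database search results against explicit inclusion criteria.
# Stage 2: pool per-study effect sizes with a fixed-effect (inverse-variance)
# meta-analysis, a common quantitative synthesizing methodology.

# Hypothetical search results; a real review would export these from databases.
records = [
    {"title": "Pair programming in CS1", "year": 2010, "empirical": True},
    {"title": "Editorial: trends in CS education", "year": 2011, "empirical": False},
    {"title": "Robotics outreach study", "year": 1995, "empirical": True},
]

def include(rec):
    """Explicit, replicable inclusion criteria (hypothetical)."""
    return rec["empirical"] and rec["year"] >= 2000

selected = [r for r in records if include(r)]
print([r["title"] for r in selected])  # only the 2010 empirical study survives

# Hypothetical per-study effect sizes and their variances: (effect, variance).
effects = [(0.40, 0.04), (0.25, 0.09), (0.55, 0.16)]

weights = [1.0 / v for _, v in effects]          # inverse-variance weights
pooled = sum(w * y for (y, _), w in zip(effects, weights)) / sum(weights)
pooled_var = 1.0 / sum(weights)                  # variance of the pooled estimate
print(round(pooled, 3), round(pooled_var, 3))
```

Note how both stages are driven by explicit, inspectable rules (the `include` function and the weighting formula), which is what makes the process transparent and replicable; in practice the screening criteria and synthesis model are specified in the review protocol before any data are examined.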
ToE has already established review criteria for the scholarship of integration, the area addressed by the proposed special issue. These review criteria can be found at http://sites.ieee.org/review‐criteria‐toe/.
This section offers examples of systematic reviews that have been done in STEM education. Generally, topics for these examples are outside topical areas that would be considered for the IEEE Transactions on Education, but they show examples of good practices for some steps in systematic reviews.
L. Springer, M. E. Stanne and S. S. Donovan, “Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis,” Review of Educational Research, vol. 69, no. 1, pp. 21-51, 1999 (doi: 10.3102/00346543069001021)
F. B. V. Benitti, “Exploring the educational potential of robotics in schools: A systematic review,” Comput. & Educ., vol. 58, no. 3, pp. 978‐988, 2012
N. Meese and C. McMahon, “Knowledge sharing for sustainable development in civil engineering: A systematic review,” AI and Soc., vol. 27, no. 4, pp. 437-449, 2012
N. Salleh, E. Mendes, and J. Grundy, “Empirical studies of pair programming for CS/SE teaching in higher education: A systematic literature review,” IEEE Trans. Softw. Eng., vol. 37, no. 4, pp. 509-525, 2011
R. M. Tamim, R. M. Bernard, E. Borokhovski, P. C. Abrami, and R. F. Schmid, “What forty years of research says about the impact of technology on learning: A second‐order meta‐analysis and validation study,” Review of Educ. Research, vol. 81, no. 1, pp. 4‐28, 2011
These resources provide guides to systematic review methodologies:
E. Barnett-Page and J. Thomas, “Methods for the synthesis of qualitative research: A critical review,” BMC Medical Research Methodology, vol. 9, no. 1, p. 59, 2009
Background: In recent years, a growing number of methods for synthesising qualitative research have emerged, particularly in relation to health‐related research. There is a need for both researchers and commissioners to be able to distinguish between these methods and to select which method is the most appropriate to their situation.
Discussion: A number of methodological and conceptual links between these methods were identified and explored, while contrasting epistemological positions explained differences in approaches to issues such as quality assessment and extent of iteration. Methods broadly fall into ‘realist’ or ‘idealist’ epistemologies, which partly accounts for these differences.
Summary: Methods for qualitative synthesis vary across a range of dimensions. Commissioners of qualitative syntheses might wish to consider the kind of product they want and select their method – or type of method – accordingly.
M. Borrego, E. P. Douglas and C. T. Amelink, “Quantitative, qualitative, and mixed research methods in engineering education,” Journal of Eng. Educ., vol. 98, no. 1, pp. 53-66, 2009
Abstract: The purpose of this research review is to open dialog about quantitative, qualitative, and mixed research methods in engineering education research. Our position is that no particular method is privileged over any other. Rather, the choice must be driven by the research questions. For each approach we offer a definition, aims, appropriate research questions, evaluation criteria, and examples from the Journal of Engineering Education. Then, we present empirical results from a prestigious international conference on engineering education research. Participants expressed disappointment in the low representation of qualitative studies; nonetheless, there appeared to be a strong preference for quantitative methods, particularly classroom‐based experiments. Given the wide variety of issues still to be explored within engineering education, we expect that quantitative, qualitative, and mixed approaches will be essential in the future. We encourage readers to further investigate alternate research methods by accessing some of our sources and collaborating across education/social science and engineering disciplinary boundaries.
D. A. Cook and C. P. West, “Conducting systematic reviews in medical education: A stepwise approach,” Medical Education, vol. 46, pp. 943-952, 2012
Objectives: As medical education research continues to proliferate, evidence syntheses will become increasingly important. The purpose of this article is to provide a concise and practical guide to the conduct and reporting of systematic reviews.
Results: (i) Define a focused question addressing the population, intervention, comparison (if any) and outcomes. (ii) Evaluate whether a systematic review is appropriate to answer the question. Systematic and non-systematic approaches are complementary; the former summarise research on focused topics and highlight strengths and weaknesses in existing bodies of evidence, whereas the latter integrate research from diverse fields and identify new insights. (iii) Assemble a team and write a study protocol. (iv) Search for eligible studies using multiple databases (MEDLINE alone is insufficient) and other resources (article reference lists, author files, content experts). Expert assistance is helpful. (v) Decide on the inclusion or exclusion of each identified study, ideally in duplicate, using explicitly defined criteria. (vi) Abstract key information (including on study design, participants, intervention and comparison features, and outcomes) for each included article, ideally in duplicate. (vii) Analyse and synthesise the results by narrative or quantitative pooling, investigating heterogeneity, and exploring the validity and assumptions of the review itself. In addition to the seven key steps, the authors provide information on electronic tools to facilitate the review process, practical tips to facilitate the reporting process and an annotated bibliography.
M. Petticrew and H. Roberts, Systematic Reviews in the Social Sciences: A Practical Guide. Malden, MA: Blackwell Publishing, 2006
A. C. Tricco, J. Tetzlaff and D. Moher, “The art and science of knowledge synthesis,” Journal of Clinical Epidemiology, vol. 64, no. 1, pp. 11‐20, 2011
Objectives: To review methods for completing knowledge synthesis.
Study Design and Setting: We discuss how to complete a broad range of knowledge syntheses. Our article is intended as an introductory guide.
Results: Many groups worldwide conduct knowledge syntheses, and some methods are applicable to most reviews. However, variations of these methods are apparent for different types of reviews, such as realist reviews and mixed‐model reviews. Review validity is dependent on the validity of the included primary studies and the review process itself. Steps should be taken to avoid bias in the conduct of knowledge synthesis. Transparency in reporting will help readers assess review validity and applicability, increasing its utility.
Conclusion: Given the magnitude of the literature, the increasing demands on knowledge synthesis teams, and the diversity of approaches, continuing efforts will be important to increase the efficiency, validity, and applicability of systematic reviews. Future research should focus on increasing the uptake of knowledge synthesis, how best to update reviews, the comparability between different types of reviews (e.g., rapid vs. comprehensive reviews), and how to prioritize knowledge synthesis topics.