Cross-disciplinary research: where next?
Dr Judy Whitmarsh
Paper presented at the British Educational Research Association Annual Conference, University of Warwick, 6-9 September 2006
Abstract:
Current political initiatives, such as the Children Act 2004 and the Every Child Matters agenda, support the joining-up of children’s services into multi-agency teams that may contribute to multi-disciplinary research. This paper engages with a critical debate about the dissemination of multi-disciplinary research findings and the relevance of small-scale research to everyday practice.
This raises two specific issues: firstly, how practicable is it to try to contextualise doctoral or master’s research findings and academic argument in a short presentation and, if we do so, do we risk ‘dumbing down’ for the audience? Secondly, when qualitative findings are presented to an audience whose research tradition may be rooted in the scientific, bio-medical model, how do we overcome the ‘paradigm gap’?
The paper does not seek to resolve the tensions within interdisciplinary research but rather to explore the differences between methodological paradigms that may contribute to these tensions.
Background:
The Children Act (2004) and the Every Child Matters (DfES 2004) documentation have triggered an explosion of interest and research into multi-agency working (see for example, Jones 2000; Anning 2001; Milbourne et al., 2003; Dahl and Aubrey 2004; Dahl et al., 2005); however, is anyone addressing the practicalities of the process of conducting multi-agency or multi-disciplinary research, that is, researching the research?
The five outcomes of Every Child Matters (DfES 2004) require education providers, from day-care through to nursery, primary, secondary and extended schools, to consider children’s need to be healthy, stay safe, enjoy and achieve, make a positive contribution, and achieve economic wellbeing. There are many initiatives in place that are finding innovative ways to work towards achieving the five outcomes (see for example, DfES 2006).
Ofsted recognition of the ways in which education and educare providers are working towards fulfilling the outcomes may be sufficient attainment for some, but research may better evaluate how, or even if, good practice is genuinely improving the lives of children, young people and their carers and families. As educational providers move to join up with other services, such as health and social care, the potential for cross-disciplinary research becomes greater; however, this raises questions not just about conducting the research but also about its dissemination. If research crosses boundaries, who will listen to its findings and, moreover, how should these findings be presented?
I approach these issues as a senior lecturer and member of the early childhood studies team in a university school of education. My personal doctoral research was located in a Sure Start programme in the Midlands and explored how first-time mothers understand and promote infant speech and language development; underpinning the thesis is a theoretical argument about issues of social justice and ‘good’ mothering. The research study and thesis are multi-disciplinary, both by subject matter and by location in a Sure Start programme: child speech and language development crosses borders of education (literacy and oracy), health (speech and language therapy, dental and medical research, child health) and social care (mothering, parenting, family interventions). The Sure Start team who advised me on the background to the research consisted of health visitors, speech and language therapists, educationalists, a child psychologist, child care workers attached to the programme, and attending Sure Start parents. Thus the research engaged with the disciplines of education, health and social care.
At the university, I teach on a number of undergraduate modules and supervise MA students: the emphasis of the undergraduate early childhood studies degree course is multi-disciplinary and students take both core and elective subjects with a focus on education, health or social work studies. They also complete a third-year dissertation that sometimes explores a multi-disciplinary theme, such as nutrition in schools, the effects of physical exercise or the care of looked-after children. The supervised higher degree students come from a variety of community or clinical nursing backgrounds, but the focus of their dissertation relates to their educational role: some are community practice tutors, others may be health visitors with a role teaching parents or colleagues, yet others teach within nursing degree courses. Both my personal research and current work practices therefore involve crossing disciplinary boundaries, and it is this process that has triggered the concerns raised in this paper.
Introduction:
The discussion of the methodology of educational research is echoed in the British Educational Research Journal: Whitty (2006) notes that, in the 1990s, a series of “…seemingly damning, albeit sometimes contradictory, criticisms…” has been taken up by politicians and the media to suggest that educational research in the UK is of poor quality, lacks rigour, and is theoretically incoherent, amongst other complaints. The ensuing academic debate appears to concentrate on two opposing viewpoints: on the one hand we have, for example, Lather (2004, p768) claiming that the US government is moving towards the “…federal legislating of scientific method” to the detriment of educational research. Hodkinson (2004, p23), like Lather, argues that the UK is moving towards an audit culture, thus towards a research culture that reifies empiricism, performativity, and a “new orthodoxy” of scientific methods. Clark (2005, p289) challenges the notion of educational research as “…the scientific investigation of the causes of ‘effective’ teaching”, arguing that values and philosophic inquiry are central to educational research, while empirical investigation is merely secondary. Smeyers (2005, p170) raises “serious doubts about the appropriate paradigm to be used for educational research…”, suggesting that educational researchers must think deeply and critically about the method to be used and must be able to make the case for their particular approach.
On the other hand, we have those such as Moore et al. (2005) promoting the application of the randomised controlled trial (RCT), even in such an unlikely subject as sex education; furthermore, Moore et al. (2003, p687) argue that there is “a lack of research capacity, most notably experience and skills” and a paucity of educational RCTs in the UK, leading them to suggest that this could be enhanced by learning from the experience of health education interventions. Styles (2006, p7) identifies the merits of “scientific method and the randomised trial”, suggesting that “[I]t is the duty of the educational evaluation research community, therefore, to lobby government to invest in proper educational trials” (p9). Moreover, Gorard et al. (2004) claim that educational research should draw on medicine to consider a two-stage experimental design, the development of “research syntheses” (p582) based on the Cochrane/Campbell collaboration (see following paragraph), and more complex research design that includes stages of theory, modelling, exploratory trial, RCT, and long-term implementation. Oakley (2000), although claiming otherwise, appears to have performed a complete volte-face from her qualitative, feminist days, producing a book that advocates the use of RCTs and greater methodological objectivity in research.
So what are the implications for multi-disciplinary research? The following section briefly sets out my position.
Taking a stance:
If we accept that performativity, target setting and an audit culture are here to stay, we can see how government policy therefore requires objective, empirical research to investigate its effectiveness. Evidence-based medicine is well established, with the National Institute for Clinical Excellence and the Cochrane/Campbell collaboration evaluating the strength of clinical interventions and providing systematic reviews. Randomised controlled trials are the existing gold standard of health-related disciplines, yet Torgerson et al. (2005) demonstrate many methodological weaknesses and flaws in health care trials over the last 20 years. They also note how, while health care trials appear to be improving in quality over time, the perceived quality of educational RCTs is diminishing. Oancea (2005) observes the criticisms in the medical press of medical research itself and of the Cochrane Collaboration approach.
Educational research is already moving in the direction of medical research practice: critics of the quality of educational research are already applying medicalised language, using terms such as ‘body of knowledge’, ‘research-based profession’ and ‘evidence-based policy and practice’ (Oancea 2005). In line with the Cochrane/Campbell collaboration, EPPI (the Evidence for Policy and Practice Information and Co-ordinating Centre) has been developed to collate systematic reviews of effective practice and provide an evidence base. As noted above, there is an increasing political focus on empiricism, RCTs and measuring effective practice. The focus on quantitative research methods and scientific objectivity in medicine is becoming central to government policy and research practice in education as political quick fixes to educational problems are sought.
Implications:
This has huge implications for multi-disciplinary research, which frequently involves education researchers working with health professionals and social scientists to evaluate joined-up working. Firstly, if empirical research is favoured by policy-makers and publishers, then researchers may choose a quantitative design to obtain funding and with an eye to potential future publication.
Obtaining funding for such large projects as EPPE (the Effective Provision of Pre-School Education) takes time, resources, and considerable expertise. Pring (2004, p164) notes:
…there are very few places that can orchestrate research on a large scale, within interdisciplinary, well-funded communities. Too often, therefore the big questions (within which small-scale but vital case studies take on a more universal significance) do not get asked…
Much early years and education research is small-scale and qualitative in its methodology; most research conducted for higher education degrees is also small-scale, frequently qualitative and focused on a single case or setting. If the RCT becomes the gold standard, where does that leave the many researchers who do not have access to the resources to conduct large-scale quantitative trials? Moreover, if research is to be truly multi-disciplinary, will those disciplines traditionally rooted within a quantitative paradigm listen to the findings that emerge from a more qualitative paradigm, and vice versa? What validity will a medico-scientific researcher give to the findings from an ethnographic, feminist, sociological study, for example? If funded educational research is moving towards a gold standard of positivism, how should university supervisors advise the Master’s student wishing to conduct a small, qualitative research study?
While admitting that my own comfort zone is within the qualitative paradigm, I raise these issues from both personal experience and a concern that educational research may move from its current flexible position, accepting and debating the advantages and disadvantages of both paradigms, towards a more bio-medical model. Studies of childhood are, by their very nature, frequently multi-disciplinary and, with the advent of Every Child Matters, likely to become increasingly orientated towards crossing boundaries; this suggests that much future research will be conducted with colleagues from health, social care and other professions. At this point in the paper, I would like to use the concept of ethical consent to demonstrate some of the current and potential issues and dilemmas for multi-disciplinary research.
Ethical consent for research:
Ethics is, as Richard Smith (2006) notes, “a burgeoning academic field”, and research with children is a subject in its own right (James et al. 1998). While much education research is subject only to consent from a university ethics committee, most health research that relates to the National Health Service (NHS) requires ethical consent from a Local Research Ethics Committee (Lrec), administered by the Central Office for Research Ethics Committees (Corec); thus, any research requiring NHS data, or interviews with NHS staff, patients, relatives or carers, is currently subject to Corec requirements. A similar level of ethical consent is therefore required by Corec, whether the research is a large-scale pharmaceutical trial or a small piece of research with adult colleagues for a Master’s study. Paradoxically, while universities are introducing, standardising and tightening up their ethics committees, Corec is currently consulting on how to introduce more flexibility into its ethics consent process in order to manage the small-scale educational research by which it is besieged.
The NHS process is creating a lack of parity between higher degree students in the same school of education at the same university: while students conducting research among teacher colleagues, for example, go through a usually relatively pain-free process of requesting ethical consent from a generally supportive school/university committee, their NHS colleagues endure a lengthy process that includes submitting a daunting electronic form and presenting themselves before a panel of about twenty consultants, statisticians, nursing colleagues and lay members of the local committee in order to defend their proposal. This adversarial ordeal (and remember these are novice researchers) may include describing statistical methods in detail, trying to explain a qualitative methodology to a mainly scientific, bio-medical panel, having every word of their participant information sheets and consent forms scrutinised and frequently altered, and sometimes being asked to rewrite their research questions and their proforma to cohere with the particular interests of the panel. It does not end there: even if consent is given, any deviation from the submitted proforma requires a return to the panel for additional permission. Local committees vary in their attitude to the researcher: some attempt to be supportive in the process, but many appear determined to make the experience as adversarial as possible.
The nursing, medical and health press has long debated the role of Lrec, suggesting that “…qualitative research is being treated unfairly and disadvantaged by ethics committees” (Ramcharan and Cutliffe 2001, p358), that there is a lack of committee members versed in qualitative methodology (Parahoo 2003), and that there is disagreement between the committee, the researcher and members of the public about the priorities and function of the committee (Kent 1997).
This debate about the role of Lrec should ring alarm bells for education researchers and remind them that all in the biomedical ethics world is not necessarily rosy. Moreover, the Lrec requirements must be seen as a warning to those wishing to research across disciplines. However, before education launches itself along a similar pathway, we also need to ask ourselves why the NHS requires such stringent controls over research; the answer perhaps lies in the administrative title of the Local Research Ethics Committee. The committee is designed not just to weigh up how ethical the proposed research may be, but also to assess the scientificity of the design of the proposal, and herein may lie the conundrum. A committee that evaluates research design by how scientific it appears to be may not be a committee that will approve ethnography, feminist research, grounded theory, and radical qualitative methodologies. This will affect the type of proposal given ethical consent and also affect any cross-disciplinary research that involves gaining Lrec consent.
Furthermore, does this imply that an audience listening to findings from Lrec-approved research has an expectation of quantitative or quantifiable findings? In my personal experience, this is the case: an audience for some of my doctoral findings, composed mainly of health visitors and nurses, found accepting qualitative findings difficult and pressed for larger-scale research and statistical data. The audience appeared more concerned with addressing the methodology than with the findings. In addition, trying to condense the qualitative, feminist perspective of the thesis, in order to give sufficient background to the findings, ran the risk of dumbing down for an audience more used to scientific data. It was as if I were speaking a different language, which may well have been the case; but if this is so, where do we go from here? Is there such a thing as true cross-disciplinary research and, if so, who is it aimed at and who will listen to its findings?
James et al. (1998, p199) suggest that the conjoining of disciplines is the analytical way forward “…to add to the total mosaic of our knowledge about children and childhood…”, yet, as this paper demonstrates, this is not without its tensions.
Deconstructing multi-disciplinary research:
Some of the issues raised in this paper could perhaps be resolved by deconstructing what is meant by cross- or multi-disciplinary research: is it research conducted by researchers from a number of disciplines, into one or a number of disciplinary areas? If so, perhaps we need to consider the weighting given to methodology by our co-researchers? Or will the study be conducted by a researcher from one discipline who dips a toe into the waters of other disciplines? If this is the case, the researcher needs to consider carefully how valid other disciplines will consider the findings to be, particularly if they are to be disseminated in a field new to the author of the research. This situation also raises questions about the language used in the dissemination, whether disciplinary technical language is appropriate, and whether the methodological values of the research are coherent with those of the ‘other’ discipline. Perhaps the research partnership involves researchers from various disciplines conjoining their expertise while remaining situated in their own discipline, publishing findings relevant to and within their own discipline? Until we have defined cross-disciplinary research and its audience, there is a danger that it will cross many barriers and appeal to none.