RESEARCH DESIGN & IMPLEMENTATION
Research design is the overall plan for answering your research question; a research method is the strategy used to implement that plan. The two are different but closely related, because a good research design ensures that the data you obtain will actually help you answer your research question.
Implementation science can be defined as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and, hence, to improve the quality and effectiveness of health services.”
COMMON QUESTIONS & IMPORTANT LINKS
The challenge of designing implementation research is exacerbated by the fact that implementation research cuts across diverse scientific fields, resulting in the inevitable difficulty of identifying, appraising and synthesizing relevant literature to inform design decisions.
WHICH RESEARCH METHOD SHOULD I CHOOSE?
It depends on your research goal, and on what (and whom) you want to study. Say you are interested in what makes people happy, or in why some students are more conscientious about recycling on campus. To answer such questions, you need to decide how you will collect your data. The most frequently used methods include:
Survey
Interview
Observation / Participant Observation
Secondary Data Analysis / Archival Study
Mixed Methods (combination of some of the above)
One particular method may be better suited to your research goal than others, because different methods yield data that differ in both quality and quantity. For instance, surveys are usually designed to produce relatively short answers, rather than the extensive responses expected in qualitative interviews.
WHAT OTHER FACTORS SHOULD I CONSIDER WHEN CHOOSING ONE METHOD OVER ANOTHER?
Time for data collection and analysis is another factor to consider. Observation and interview methods (so-called qualitative approaches) help you collect richer information, but they take time. A survey helps you collect more data quickly, yet it may lack detail. You will therefore need to weigh the time you have for research against the strengths and weaknesses of each method (e.g., qualitative vs. quantitative).
WHAT ARE PROCESS EVALUATION DESIGNS?
To conduct process evaluations on how well services are delivered, data need to be gathered on the content of interventions and on their delivery systems. Suggested methodologies include direct observation, surveys, and record keeping.
Direct observation designs include case studies, in which participant-observers unobtrusively and systematically record encounters within a program setting, and nonparticipant observation, in which long, open-ended (or "focused") interviews are conducted with program participants.
For example, "professional customers" at counseling and testing sites can act as project clients to monitor activities unobtrusively; alternatively, nonparticipant observers can interview both staff and clients. Surveys—either censuses (of the whole population of interest) or samples—elicit information through interviews or questionnaires completed by project participants or potential users of a project. For example, surveys within community-based projects can collect basic statistical information on project objectives, what services are provided, to whom, when, how often, for how long, and in what context.
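The census-versus-sample distinction can be illustrated with a short sketch. This is a minimal, hypothetical example (the participant roster and sizes are invented for illustration, not drawn from any actual project):

```python
import random

# Hypothetical roster of project participants.
# A census would survey all of them; a sample surveys a subset.
participants = [f"participant_{i:03d}" for i in range(1, 201)]

def draw_simple_random_sample(roster, n, seed=42):
    """Draw a simple random sample of n participants without replacement.

    A fixed seed makes the draw reproducible for documentation purposes.
    """
    rng = random.Random(seed)
    return rng.sample(roster, n)

sample = draw_simple_random_sample(participants, n=50)
print(len(participants))  # census size: 200
print(len(sample))        # sample size: 50
```

More elaborate schemes (stratified or cluster sampling) follow the same idea but partition the roster first; simple random sampling is shown here only because it is the baseline design.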
Record keeping consists of administrative or other reporting systems that monitor use of services. Standardized reporting ensures consistency in the scope and depth of data collected. To use the media campaign as an example, the panel suggests using standardized data on the use of the AIDS hotline to monitor public attentiveness to the advertisements broadcast by the media campaign.
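Standardized record keeping amounts to a fixed schema plus routine aggregation. A minimal sketch follows; the field names, categories, and example entries are illustrative assumptions, not the panel's actual reporting forms:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class ServiceRecord:
    """One standardized entry in a service-use log (illustrative schema)."""
    service: str       # e.g., "hotline_call", "counseling_session"
    client_group: str  # e.g., "new", "returning"
    duration_min: int  # length of the encounter in minutes

# Example log entries, as staff might record them on a standardized form.
log = [
    ServiceRecord("hotline_call", "new", 12),
    ServiceRecord("hotline_call", "returning", 8),
    ServiceRecord("counseling_session", "new", 45),
]

# Routine aggregation: how often each service was used.
usage = Counter(rec.service for rec in log)
print(usage["hotline_call"])
```

Because every record carries the same fields, the scope and depth of the data stay consistent across sites, which is exactly what makes monitoring (e.g., of hotline use during a media campaign) possible.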
These designs are simple to understand, but they require expertise to implement. For example, observational studies must be conducted by people who are well trained in how to carry out on-site tasks sensitively and to record their findings uniformly. Observers can either complete narrative accounts of what occurred in a service setting or they can complete some sort of data inventory to ensure that multiple aspects of service delivery are covered. These types of studies are time consuming and benefit from corroboration among several observers.
The use of surveys in research is well understood, although they, too, require expertise to be well implemented. As the program chapters reflect, survey data collection must be carefully designed to reduce problems of validity and reliability and, if samples are used, an appropriate sampling scheme must be developed. Record keeping or service inventories are probably the easiest research designs to implement, although preparing standardized internal forms requires attention to detail about salient aspects of service delivery.
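When a sample rather than a census is used, a starting point for the sampling scheme is the standard sample-size formula for estimating a proportion, n = z²·p(1−p)/e². A sketch, with illustrative choices of confidence level and margin of error:

```python
import math

def required_sample_size(p=0.5, margin_of_error=0.05, z=1.96):
    """Sample size needed to estimate a proportion p within the given
    margin of error at ~95% confidence (z = 1.96).

    p = 0.5 is the conservative worst case: it maximizes p * (1 - p),
    so it gives the largest (safest) sample size.
    """
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

print(required_sample_size())                      # 385
print(required_sample_size(margin_of_error=0.10))  # 97
```

Halving the margin of error roughly quadruples the required sample, which is one reason the time-versus-precision trade-off discussed above matters at the design stage.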
HOW CAN ADELVE HELP US IN OUR RESEARCH DESIGN EFFORTS?
ADELVE has the necessary knowledge and expertise to help you achieve your specific goals.
By bringing together scientific, academic, and industry experts, we provide tailored plans and implementation processes that lead to outstanding results.
We apply innovative digital tools and methods within a structured roadmap designed to deliver optimized results.
Let us know your specific needs!