The methodology section is where your paper earns credibility. Readers decide whether to trust your findings based on how carefully you describe the process that produced them, which means the Method is not a formality — it is the ground your entire argument stands on. This guide shows you how to write a methodology section from which a peer in your field could replicate your study: the subsections required by APA 7, the differences between quantitative and qualitative conventions, and the specificity that separates a strong Method from a vague one. You will see a worked example, learn the common failure modes, and pick up a structure you can apply to lab reports, empirical papers, and theses.
PaperDraft is a writing assistant, not a paper generator — the draft is your starting point, not your submission. You are responsible for editing, verifying sources, and following your school's academic integrity policy.
What a methodology section is — and what it is not
The methodology section — labeled Method in APA 7 papers — describes how you conducted the research in enough detail that a competent peer could replicate the study. It answers who participated, what materials were used, what procedure was followed, and how the data were analyzed. In APA 7, the Method section typically contains subsections titled Participants, Materials (or Measures), Procedure, and Data Analysis, though the exact subheadings vary by discipline and design.
A methodology section is not a literature review. It does not argue for why the method is appropriate — that argument belongs in the introduction or in a brief framing sentence before the subsections. A line or two of justification is acceptable; a page of justification is misplaced.
A methodology section is not a results section. It describes what you did, not what you found. If you catch yourself writing "we found that...", you have drifted into Results.
A methodology section is not a narrative of your research journey. "First I tried X, but it didn't work, so I moved to Y" is a reflection, not a method. Describe the method you ultimately used, in the order participants or data experienced it — not the order you tried things.
A methodology section is also not where you confess limitations. Limitations belong in the discussion. In the Method, describe what you did plainly and completely; leave the interpretation of those choices to the sections designed for it.
Finally, the Method is not the place to reproduce full instruments verbatim unless the venue requires it. Cite validated scales, summarize item content, and put full protocols in an appendix or supplemental materials.
Before you write
You should draft the methodology section once your study is complete or, if you are writing a proposal, once your design is finalized. For a completed study, gather the following before opening the document:
- Sample data. Final N, demographics, recruitment source, dates of data collection, attrition if any.
- Instrument details. Every scale, test, or measure — name, citation, subscales used, scoring, reliability (Cronbach's alpha, ICC, or equivalent) in your sample.
- Stimuli and apparatus. Any physical or digital materials — software versions, hardware models, experimental platforms (Qualtrics, PsychoPy, E-Prime) with versions.
- IRB or ethics approval. Protocol number, approving body, and date. Consent procedures.
- Pre-registration link, if applicable. OSF, ClinicalTrials.gov, or journal-specific registry.
- Analysis scripts. Software, packages, versions, and the analyses you actually ran.
For qualitative work, add:
- Sampling rationale. Purposive, theoretical, snowball — and why.
- Data collection details. Interview length, setting, recording method, transcription approach.
- Coding approach. Framework used (thematic, grounded theory, discourse analysis), coder agreement if used, software (NVivo, MAXQDA, Dedoose).
- Reflexivity statement. Your positionality and how it shaped data collection and analysis.
Check whether your target venue provides a reporting checklist — CONSORT for randomized trials, PRISMA for systematic reviews, STROBE for observational studies, COREQ for qualitative interviews. Using the checklist before you draft prevents the line-by-line rewrite a reviewer will otherwise demand.
Step-by-step: how to write a methodology section
1. Choose the right structure for your design
The APA 7 default for empirical quantitative work is four subheadings: Participants, Materials (or Measures), Procedure, Data Analysis. Qualitative research replaces Materials with Data Collection and adds a Reflexivity subsection. Mixed-methods studies split into quantitative and qualitative phases with their own internal structure. Secondary analyses or archival studies start with a Data Source subsection describing the original dataset. Match the structure to what you did; do not force a lab-study structure onto an interview study.
2. Describe participants completely
Report final sample size, sampling method (convenience, probability, purposive), demographics relevant to the question (age, gender, education, diagnosis as applicable), inclusion and exclusion criteria, recruitment channel, incentive if any, and consent procedure. For clinical or protected populations, describe safeguards. A reader should be able to judge whether your sample supports the generalization you will later make in the discussion.
3. Detail materials and measures
List every instrument with its full name, the citation for the instrument, the subscales you used, the response format, the scoring direction, and the reliability observed in your sample. For experimental stimuli, describe the stimulus set — how many items, what categories, whether norms exist. For software, give version numbers. Cite measures in APA format.
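Reliability is reported for your own sample, not copied from the validation paper. As a sketch of the underlying arithmetic (your stats package does this for you in practice, and the item responses below are invented), Cronbach's alpha is the item count scaled by one minus the ratio of summed item variances to the variance of the total scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns.

    items: list of equal-length lists, one per scale item.
    Uses population variance; sample variance gives the same
    ratio as long as it is used consistently.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per participant across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    item_var_sum = sum(var(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical responses from 5 participants on a 3-item scale
scale = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
print(round(cronbach_alpha(scale), 2))  # 0.89
```

The value you report in the Method is this computation run over your real item-level responses, not the alpha from the instrument's original validation study.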
4. Walk through the procedure in order
Describe what happened to participants, in the order it happened. Arrival, consent, randomization, task, debrief. If procedures varied by condition, describe both. Include timing where relevant (total session length, time between sessions in a longitudinal study). The procedure is the subsection a replication attempt depends on most — spend the words.
5. Specify the analysis plan
Name the statistical tests you ran, the software and package versions, the alpha level, and any corrections (Bonferroni, Benjamini–Hochberg). For Bayesian analyses, specify priors and the tool used. For qualitative analyses, describe the coding framework, the number of coders, inter-rater agreement if calculated, and how disagreements were resolved. If you pre-registered, note which analyses were confirmatory and which were exploratory.
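The two corrections named above are simple enough to sketch: Bonferroni multiplies each p-value by the number of tests, while Benjamini–Hochberg is a step-up procedure that controls the false discovery rate. A minimal pure-Python illustration with invented p-values (in practice you would use your stats software's built-in adjustment):

```python
def bonferroni(pvals):
    """Bonferroni-adjusted p-values: multiply by the number of tests, cap at 1."""
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def benjamini_hochberg(pvals):
    """BH step-up adjusted p-values (controls the false discovery rate)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    prev = 1.0
    # Walk from the largest p-value down, enforcing monotonicity
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)
        adj[i] = prev
    return adj

ps = [0.001, 0.02, 0.04, 0.30]
print(bonferroni(ps))          # [0.004, 0.08, 0.16, 1.0]
print(benjamini_hochberg(ps))
```

Note how much less conservative BH is on the middle p-values; which correction you choose, and why, belongs in this subsection.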
6. Add a reflexivity or positionality note for qualitative work
In qualitative research, your presence shapes the data. Name your role relative to participants (insider, outsider, clinician, peer), the theoretical lens you brought, and the steps you took to manage its influence — memoing, peer debriefing, member checks. Quantitative researchers do this less formally but may still declare conflicts of interest or prior involvement with the population.
7. Check for replicability
Hand the draft to a peer in your field and ask them to describe how they would replicate the study from your Method alone. Where they cannot answer, add detail. Where they ask twice, rewrite. The Method is finished when a competent peer could run it without asking you a question.
Stuck at the start? PaperDraft scaffolds a starting methodology section — the moves, the register, the structure — for you to revise. Start this paper — free.
Example methodology section
Below is an excerpt from a methodology section of a survey-based quantitative study, with commentary in brackets.
Method
Participants
[Opens with final sample, not recruitment] A total of 240 undergraduate students (M age = 20.3, SD = 1.8; 62% women, 37% men, 1% non-binary) participated in exchange for course credit. [Recruitment channel and eligibility] Participants were recruited through the department subject pool at a large public university in the midwestern United States and were eligible if they were enrolled in at least one psychology course and had not participated in prior procrastination research within the last 12 months. [Attrition and final N transparency] Of 267 who began the study, 240 completed all three time points (retention = 89.9%); analyses use complete cases. [IRB] The protocol was approved by the University IRB (protocol #2024-318), and all participants provided written informed consent.
Measures
[Scale with citation, subscales, scoring, sample reliability] Procrastination was measured with the 20-item Academic Procrastination Scale (APS; McCloskey, 2011), rated on a 5-point Likert scale (1 = disagree to 5 = agree) with higher scores indicating greater procrastination. The APS showed acceptable internal consistency in the present sample (α = .89). Self-compassion was measured with the Self-Compassion Scale–Short Form (SCS-SF; Raes et al., 2011), a 12-item measure rated on a 5-point scale (α = .86).
Procedure
[Ordered chronologically, with condition details] After consent, participants completed baseline measures online via Qualtrics (April 2024). They were then randomly assigned, using a block-randomization sequence generated in R, to either the self-compassion condition or an active control focused on implementation intentions. Both interventions consisted of four 20-minute weekly online modules delivered over four weeks. Post-intervention measures were administered at week 5, and follow-up measures at week 9.
Data Analysis
[Software, tests, and whether confirmatory] Analyses were conducted in R (version 4.3.2) using the lme4 package (Bates et al., 2015). The pre-registered primary analysis was a linear mixed-effects model with time (baseline, post, follow-up) and condition (self-compassion, control) as fixed effects and a random intercept for participant. The pre-registration is available at https://osf.io/abc123. Alpha was set at .05; no corrections were applied to the single pre-registered primary test.
Notice the specificity: exact N, exact reliability, exact software version, exact pre-registration link. That specificity is what earns the reader's trust.
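The block-randomization sequence mentioned in the Procedure is worth understanding even if your software generates it for you: conditions are shuffled within fixed-size blocks so the arms stay balanced throughout recruitment. A sketch in Python (the example study used R; the block size and seed here are invented for illustration):

```python
import random

def block_randomize(n_participants, conditions, block_size, seed=2024):
    """Assign conditions in shuffled blocks so arms stay balanced.

    block_size must be a multiple of the number of conditions.
    """
    assert block_size % len(conditions) == 0
    rng = random.Random(seed)  # fixed seed so the sequence is reproducible
    per_block = block_size // len(conditions)
    sequence = []
    while len(sequence) < n_participants:
        block = conditions * per_block  # each condition appears equally often
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n_participants]

seq = block_randomize(12, ["self-compassion", "control"], block_size=4)
# Every block of 4 contains each condition exactly twice, so the two
# arms never differ by more than 2 at any point during recruitment.
```

Reporting the method (block randomization), the tool (R), and the seed or sequence source is what lets a replication reproduce your assignment procedure.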
Common mistakes
- Vague participant description. "University students" is not a sample. Report N, demographics, recruitment, and eligibility.
- Missing instrument citations. Every validated scale has a citation and a reliability estimate. Missing either suggests the instrument was not properly sourced.
- Narrative past about the research journey. "We initially tried X, but..." belongs in a reflection, not a Method. Describe what you ultimately did.
- Collapsing method into one paragraph. Subheadings exist to help readers navigate. Use them. A Method section without Participants / Materials / Procedure structure looks amateurish in APA-style disciplines.
- Dropping the ethics statement. IRB or ethics-board approval and consent procedures are required by most journals and by most instructors. Omitting them is a credibility failure that also carries integrity implications — see our academic responsibility guide.
How PaperDraft helps you start
For methodology sections, PaperDraft scaffolds a starting structure with the right subsections for your study design, standard APA 7 headings, and placeholder prompts for the specifics you need to fill in: exact N, reliability, software versions, IRB details. You rewrite those placeholders with the real numbers from your study, verify every instrument citation, and add the discipline-specific detail that makes the method replicable. Start scaffolding at PaperDraft's research-paper workflow, and check the APA citation guide for formatting the instrument and software references.
Frequently asked questions
What tense should I use in the Method?
Past tense for what you did — "Participants completed the scale" — and present tense for descriptions of materials that still exist — "The APS is a 20-item measure." Mixing the two correctly is a small but noticeable register marker.
How long should the Method be?
Variable. In a 6,000-word empirical paper, Method is often 800–1,500 words; in a lab report it may be shorter; in a thesis chapter it can run several thousand. Replicability is the standard, not length. See our guide on writing research paper introductions for how Method relates to the framing that precedes it.
Do I cite sources in the Method?
Yes — every instrument, procedure adapted from prior work, software package, and analysis approach needs a citation. Method is less citation-dense than Introduction but not citation-free.
What if my study was exploratory and I made changes as I went?
Describe what you ultimately did, and be transparent about what was pre-planned versus exploratory. Many venues now ask you to label confirmatory and exploratory analyses separately. Honesty about this is part of credible method reporting.
Should I disclose if I used an AI tool in my research process?
Yes, if it touched the research — for example, if an LLM helped with coding transcripts, generating stimuli, or analyzing data. Disclose the tool, the version, the prompt or role it played, and the verification you performed. See our AI disclosure guide.