How to Write a Materials and Methods Section: Key Components

The materials and methods section explains exactly what you did in your study, in enough detail that another researcher could repeat it and get the same results. Think of it as a recipe: every ingredient, every step, every measurement matters. Getting it right is one of the most straightforward parts of writing a paper, but it requires discipline and attention to specifics that are easy to overlook.

The Core Purpose: Answering “How?”

Every sentence in your methods section should answer one of two questions: “How did you do it?” or “How much?” For every parameter you measured, the reader should understand exactly which criteria you used and how the evaluation was carried out. This isn’t just a formality. Reproducibility is what makes a study robust, and the methods section is where you prove yours can hold up to scrutiny. Reviewers will evaluate the reliability of your entire investigation based on what you write here, so vague descriptions or missing steps will raise red flags before anyone even looks at your results.

Standard Components to Include

While the exact sub-sections vary by field and journal, most methods sections cover five core elements: study design, setting and subjects, data collection, data analysis, and ethical approval. Organizing your section around these categories gives readers a predictable structure they can navigate quickly.

Study Design

State what type of study you conducted: randomized controlled trial, cohort study, case-control, cross-sectional, systematic review, or another design. This single sentence frames everything that follows. For a randomized controlled trial, briefly explain how participants were assigned to groups and what the intervention and control conditions were.

Setting and Subjects

Describe where the study took place and when. Include the relevant dates for recruitment, exposure periods, follow-up, and data collection. Define your study population with clear inclusion and exclusion criteria so readers understand exactly who qualified and who didn’t. State your primary and secondary outcome measures here as well, since the reader needs to know what you were measuring before you explain how you measured it.

Data Collection

This is where you walk through your procedures step by step. Identify all your variables: independent (what you manipulated or compared), dependent (what you measured), and controlled (what you held constant). Every measurement tool, survey instrument, or technique should be named and described clearly enough that someone unfamiliar with your lab could follow along.

Data Analysis

Explain how you determined your sample size, typically through a power analysis that shows the study was large enough to detect a meaningful difference. Name the statistical tests you used and why they were appropriate for your data type. If you examined subgroups or ran secondary analyses, describe those too. Include the name and version of the statistical software you used.
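As a rough illustration of what a power analysis involves, the sketch below uses Python’s standard library and the common normal-approximation formula for a two-group comparison. The effect size, alpha, and power values are hypothetical planning assumptions, not recommendations; a real analysis would use values justified by prior data or pilot work.

```python
import math
from statistics import NormalDist

# A priori sample-size sketch for a two-group comparison, using the
# normal-approximation formula: n = 2 * ((z_{alpha/2} + z_beta) / d)^2.
# All three planning values below are hypothetical.
effect_size = 0.5   # expected standardized difference (Cohen's d)
alpha = 0.05        # two-sided significance threshold
power = 0.80        # desired probability of detecting the effect

z = NormalDist()
z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for a two-sided alpha of 0.05
z_beta = z.inv_cdf(power)            # ~0.84 for 80% power

n_per_group = math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)
print(n_per_group)  # 63 participants per group under these assumptions
```

An exact t-test calculation (as produced by dedicated power software) gives a slightly larger number; the approximation is only meant to show what the inputs to the calculation are.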

Ethical Approval

State that your study received approval from an Institutional Review Board or ethics committee. For clinical trials, include the registration number from ClinicalTrials.gov or an equivalent registry. If your study involved human participants, note that informed consent was obtained. This section can be brief, but it cannot be skipped.

How to Report Materials and Equipment

When you mention a reagent, kit, instrument, or piece of software, include enough identifying information that someone could purchase or access the exact same thing. At minimum, provide the product name, manufacturer, and the manufacturer’s location (city, state or country). Many journals also require catalog numbers, and for biological reagents, lot numbers can matter because formulations change between production batches. If you used a custom-built piece of equipment or wrote your own software, describe it in enough detail that another team could build or code something equivalent, or note where the design files or source code are available.

For commonly used techniques or published protocols, you don’t need to describe every step from scratch. A citation to the original method is sufficient, but you should note any modifications you made. The rule of thumb: if you changed nothing, cite it; if you changed something, cite the original and describe what you did differently.

Writing Style and Tense

Write the methods section in past tense. You’re describing work that has already been completed, so “samples were collected” and “we measured” are both correct, while “samples are collected” is not. This convention holds across scientific disciplines and, along with the results, sets the completed work apart from the rest of the paper. Your introduction, for comparison, uses present tense to discuss established knowledge, and your discussion moves between past and present.

The choice between active voice (“we measured the temperature”) and passive voice (“the temperature was measured”) is less rigid than it used to be. Many journals, including Nature, now encourage active voice because it’s more direct and easier to read. Passive voice still works well when the action matters more than who performed it, or when you want to keep the same grammatical subject across several consecutive sentences for smoother reading. The best methods sections use both, choosing whichever makes each sentence clearer.

Organizing the Section

For most studies, a chronological structure works well: describe what you did in the order you did it. Start with study design and participant selection, move through your experimental procedures, and end with statistical analysis. This mirrors the logical flow of the research and makes it easy for readers to follow your process from start to finish.

For complex studies with multiple independent experiments or techniques, a thematic organization often works better. Group related procedures under descriptive sub-headings (e.g., “Cell Culture,” “Western Blot Analysis,” “Behavioral Testing”) so readers can find specific protocols without reading the entire section linearly. Many journals in the biological and biomedical sciences expect this approach. Whichever structure you choose, be consistent. Don’t jump back and forth between experimental phases or introduce a technique in one sub-section only to add crucial details about it three sub-sections later.

Using Reporting Guidelines

Reporting guidelines are standardized checklists that tell you exactly what information to include for your specific study type. Using the right one dramatically reduces the chance of leaving out something important, and many journals now require that you submit a completed checklist alongside your manuscript.

The most widely used guidelines, maintained by the EQUATOR Network, include:

  • CONSORT for randomized controlled trials (updated in 2025)
  • STROBE for observational studies (cohort, case-control, cross-sectional)
  • PRISMA for systematic reviews and meta-analyses
  • STARD for diagnostic accuracy studies
  • ARRIVE 2.0 for animal research
  • CARE for case reports
  • SRQR or COREQ for qualitative research
  • CHEERS for health economic evaluations
  • SQUIRE for quality improvement studies

ARRIVE 2.0, for example, divides its 21 items into an “Essential 10” that represents the minimum reporting requirement and a recommended set that adds further context. For randomized trials, CONSORT requires a flow diagram showing how participants moved through each stage of the study, from enrollment to analysis. Check which guideline matches your study type and use it as a structural template before you start writing.
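The accounting behind a CONSORT flow diagram can be sanity-checked with a few lines of arithmetic. The participant counts in this sketch are made up purely to illustrate the principle: every participant must be accounted for at each stage, and the numbers at adjacent stages must reconcile.

```python
# Hypothetical participant counts for each CONSORT stage.
assessed_for_eligibility = 250
excluded = 50  # did not meet inclusion criteria, declined, or other reasons
randomized = assessed_for_eligibility - excluded

allocated = {"intervention": 100, "control": 100}
lost_to_follow_up = {"intervention": 6, "control": 4}
analyzed = {group: allocated[group] - lost_to_follow_up[group]
            for group in allocated}

# Consistency checks: the arms must sum to the randomized total, and no
# arm can analyze more participants than were allocated to it.
assert sum(allocated.values()) == randomized
assert all(analyzed[g] <= allocated[g] for g in allocated)
print(analyzed)  # {'intervention': 94, 'control': 96}
```

If checks like these fail on your own numbers, the flow diagram (and likely the manuscript) has an inconsistency a reviewer will catch.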

Mistakes That Get Manuscripts Rejected

The single most common problem is insufficient detail. If a reader cannot understand exactly what you did and replicate it, the methods section has failed its purpose. This is especially true for custom protocols, novel techniques, or any step where you deviated from a published method. Err on the side of too much detail rather than too little; reviewers will tell you to trim, but they’re far more likely to send the paper back because they can’t figure out what you actually did.

Another frequent mistake is slipping results into the methods section. The methods describe what you planned and executed. The moment you start reporting what you found, you’ve crossed into results territory. Keep the two cleanly separated. Similarly, don’t justify your methodological choices with lengthy arguments. A brief phrase explaining why you chose a particular approach is fine (“because of the non-normal distribution of the data, we used a non-parametric test”), but extended rationales belong in the discussion.

Omitting statistical details is a subtler problem that nonetheless undermines credibility. Stating that you “used statistical analysis” without specifying the tests, significance threshold, or software version tells the reader almost nothing. Name the specific tests, report your alpha level (typically 0.05), describe how you handled missing data or outliers, and identify the software and its version number.
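To make the point concrete, here is a minimal sketch of the details such a statement should carry, assembled in Python. The test name and missing-data strategy are hypothetical choices for illustration, not recommendations:

```python
import platform

# Minimal statistical detail a methods section should state explicitly.
# The specific test and missing-data strategy here are hypothetical.
alpha = 0.05
plan = {
    "primary_test": "Mann-Whitney U test",
    "alpha": alpha,
    "missing_data": "complete-case analysis",
    "software": f"Python {platform.python_version()}",
}

methods_sentence = (
    f"Group differences were assessed with the {plan['primary_test']} at a "
    f"significance level of {plan['alpha']}; missing values were handled by "
    f"{plan['missing_data']}. Analyses were performed in {plan['software']}."
)
print(methods_sentence)
```

Note that the software version is retrieved programmatically rather than typed from memory; recording it at analysis time avoids reporting a version you later upgraded past.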

Finally, forgetting to mention ethical approval or trial registration is an easy fix that can delay publication significantly. Even if your study was exempt from full review, state that explicitly and name the body that granted the exemption.