MLRC 2026: Reproducibility as an Official Track at NeurIPS
When Joelle Pineau launched the first Machine Learning Reproducibility Challenge at ICLR 2018, it was a small community experiment: could we systematically invite researchers to reproduce published results and share what they found?
Eight years and eight editions later, we are delighted to announce that MLRC 2026 will be an official track at NeurIPS 2026 – the first time in MLRC’s history that reproducibility science has a dedicated home inside a major ML conference.
This milestone reflects the community’s growing relationship with reproducibility. NeurIPS wanted to send a meaningful signal to the field that reproducibility has become a scientific question worthy of its own rigorous study, and the official track is that signal made concrete.
From challenge to venue
The early iterations of MLRC (v1, v2, v3) were structured as challenges: pick a paper, try to reproduce it, and report what happened. They were enormously valuable as an educational exercise, especially for early-career researchers. Courses like FACT AI at the University of Amsterdam built entire curricula around the challenge, and the quality of the work that came out of those courses showed just how seriously students took the opportunity.
As we reflected on what we learned across those iterations, it became clear that MLRC should be more than a challenge. Reproducibility is not a binary outcome: a paper is never simply “reproducible” or “not.” The most insightful submissions were always the ones that engaged deeply with a paper’s specific claims, pushed those claims into new settings (generalization), tested their limits, uncovered novel insights beyond the original results, and reported back with nuance. We wanted to create a venue that actively sought out that kind of contribution, and over the years we refined the program to incentivize such submissions.
Reproducibility studies are also not easily rewarded by the ML community’s standard metric of novelty: they do not propose new architectures, beat state-of-the-art numbers, or introduce new datasets. Getting researchers to invest serious effort in this kind of work required building a publication and recognition path that made that investment worthwhile. We have built this path incrementally over the years: early iterations published through ReScience, a respected open journal for reproducibility across computational science; then, in 2023, we transitioned to TMLR, bringing MLRC papers into a high-prestige, well-indexed ML venue with a rigorous open review process.
While the early editions of MLRC ran primarily as satellite workshops at conferences, in 2022 and 2023 we partnered with NeurIPS through the Journal to Conference track, where reproducibility papers were presented in poster sessions alongside conference papers. In 2025, we held MLRC’s first in-person conference at Princeton University, giving the work a dedicated physical stage: a full-day event with invited keynotes, oral presentations, posters, and networking sessions. Each step raised the incentives, and the NeurIPS track is the next step in that progression.
What this means in practice
MLRC 2026 accepted reproducibility papers will be presented in person at NeurIPS 2026 in Sydney, Australia (December 6–13, 2026), alongside papers from the Main Track and the Evaluations & Datasets Track. The submission and review process remains anchored in TMLR. Reproducibility papers must first be accepted at TMLR within the eligibility window, and then undergo a light compatibility review by the MLRC committee to confirm suitability for the track.
This TMLR-first model is deliberate. TMLR’s open, continuous reviewing cycle allows authors to refine their work and get expert feedback before it is considered for presentation. It also means that accepted papers carry the full weight of TMLR’s review standards, independent of MLRC. The MLRC committee’s role is then to identify papers among accepted TMLR submissions that represent the best of reproducibility science and would benefit from the visibility of a NeurIPS venue.
What we are looking for
MLRC has always welcomed a broad range of reproducibility work, and that continues this year. We are looking for papers that take reproducibility seriously as a scientific question — not just as a means to an end, but as a contribution in its own right. This includes:
- Reproductions and replications that rigorously test specific claims from published papers, whether they confirm, partially replicate, or fail to reproduce prior results
- Generalizability studies that extend original findings to new settings, datasets, or model architectures, adding insights that the original paper could not offer
- Meta-reproducibility studies examining reproducibility patterns across a body of related work
- Methods and tools that make reproducibility research more accessible or rigorous
- AI-assisted reproducibility, including studies that use or critically evaluate automated approaches to replicating research papers
- Reproducibility of AI systems and agents as subjects of study in their own right
An important point to note: negative results and partial failures to reproduce are as valuable as confirmations. Science advances by understanding where claims hold and where they do not. A careful, well-documented failure to reproduce a result — with a clear account of what was tried and what was found — is a genuine contribution to the literature.
Note: Work focused on evaluation methodology more broadly may also be a good fit for the NeurIPS 2026 Evaluations & Datasets track. We encourage authors to consider both venues when deciding where to submit.
How to submit
To be eligible for MLRC 2026, your reproducibility paper must be accepted to TMLR (either as-is or with minor revisions) and must have been submitted to TMLR between June 20, 2025, 23:59 AOE and September 30, 2026, 23:59 AOE. Please check our CFP for more details.

We accept submissions through three paths:
Path 1: Expression of interest (EOI) before acceptance. If your reproducibility paper is currently under review at TMLR and was submitted within the eligibility window, you may submit this EOI form to express your intent to be considered for MLRC. The “intent to submit” deadline is June 4, 2026, AOE. This is a soft deadline, intended to encourage you to submit your paper to TMLR well in advance so that you receive a decision in time. If your paper is subsequently accepted to TMLR, you will be asked to update the form with your acceptance details and camera-ready materials. The hard deadline for your TMLR acceptance decision is September 30, 2026, AOE; we cannot consider your paper for MLRC after this date, even if you are still waiting on a TMLR decision.
Path 2: Self-nomination after acceptance. If your reproducibility paper has already been accepted to TMLR and was submitted within the eligibility window (which opens June 20, 2025, 23:59 AOE), you may submit this form to be considered for MLRC. Once accepted to MLRC, you will be asked to update the form with your camera-ready details. As in Path 1, the hard deadline to submit this form is September 30, 2026, AOE. Ensure you have not also submitted your accepted paper to the NeurIPS 2026 Journal to Conference track, as our dual submission policy does not permit presenting the same paper in both venues.
Path 3: Area Chair nomination. TMLR Area Chairs may nominate accepted reproducibility papers within the submission window for consideration at MLRC. No action is required from authors unless they are contacted. Area Chairs can use the same form to submit their nomination. The deadline for TMLR Area Chairs to nominate reproducibility papers is also September 30, 2026, AOE.
For all paths, please read the TMLR author guidelines and submission guidelines for formatting and review process details.
The hard deadline to have your TMLR acceptance in our system is September 30, 2026, 23:59 AOE.
Important Dates
- Earliest date of acceptance of your TMLR reproducibility paper for consideration at this year’s MLRC: June 20, 2025, 23:59 AOE
- Soft deadline (EOI / intent to submit): June 4, 2026, 23:59 AOE
- Deadline to have TMLR decisions in our system: September 30, 2026, 23:59 AOE
- Author notifications: October 7, 2026
- NeurIPS 2026: December 6–13, 2026
Closing Thoughts
Reproducibility has always been foundational to science. What MLRC has tried to do, across eight editions, is make reproducibility a first-class research activity in machine learning — one that is worth investing in, publishing, and being recognized for. Having MLRC as an official NeurIPS track is an affirmation that the community values this work, and we hope it encourages more researchers to take reproducibility seriously as a scientific contribution.
We look forward to seeing the community’s work at NeurIPS in Sydney. Please visit the MLRC 2026 website for full details, and do not hesitate to reach out at reproducibility-chairs@neurips.cc with any questions.