If you've ever Googled "natural testosterone booster," you've encountered the number 42%. It appears on product labels, in marketing emails, on landing pages, and in breathless forum posts. "Clinically proven to increase testosterone by 42% in just 12 days!" The claim traces back to a single study, and it has generated hundreds of millions of dollars in supplement sales over the past fifteen years.
The study is real. The number is real. But the story the supplement industry tells about it leaves out almost everything that matters. Here's what actually happened — and what happened when other scientists tried to replicate the result.
The Study That Started It All
In 2009, Enzo Topo and colleagues published a study in the journal Reproductive Biology and Endocrinology examining the effect of D-aspartic acid (DAA) on testosterone levels. D-aspartic acid is an amino acid — one of the building blocks of proteins — that plays a role in the neuroendocrine system. Specifically, it's involved in the synthesis and release of luteinizing hormone (LH) in the pituitary gland, and it's found in Leydig cells where testosterone is produced.
The biological rationale was sound. If DAA is involved in the signaling cascade that triggers testosterone synthesis, supplementing with it might enhance that signaling. Reasonable hypothesis. The kind of thing worth testing.
Topo's study gave 23 men 3.12 grams of DAA daily for 12 days. After the supplementation period, the researchers reported a 42% increase in testosterone levels in the DAA group. LH also increased by 33%. These are enormous effect sizes — the kind of numbers that make you sit up in your chair.
And that's exactly what the supplement industry did. Within months, D-aspartic acid went from an obscure amino acid to a headline ingredient in testosterone boosters worldwide. The 42% number became the anchor for an entire product category.
The Problems with the Topo Study
When I actually sat down with the Topo 2009 paper, the red flags were immediate. This is what happens when you read the methodology instead of just the abstract.
Sample size: 23 men. Twenty-three. In a field where individual hormonal variation can swing testosterone levels by 30% or more from one blood draw to the next, 23 participants is not a foundation for any confident conclusion. A single outlier — one man who happened to be at a hormonal nadir at baseline and recovered to his normal level during the study — could meaningfully move the group average. With 23 people, statistical power is razor-thin.
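The outlier arithmetic is worth making concrete. Here's a minimal sketch — every number is hypothetical, chosen only to illustrate the mechanism described above, not taken from the Topo data:

```python
# Hypothetical illustration: 23 men, no real treatment effect.
# 22 of them show no change; one happened to be measured at a
# hormonal nadir at baseline and simply returned to his normal
# level, registering as a +100% individual "increase".
n = 23
individual_changes = [0.0] * (n - 1) + [1.0]

group_mean_change = sum(individual_changes) / n
print(f"Apparent group-average increase: {group_mean_change:.1%}")  # prints 4.3%
```

One regression-to-the-mean case contributes over four percentage points to the group average. It takes only a handful of such cases — entirely plausible given the 30%+ draw-to-draw variability — to manufacture a sizable "effect" with no pharmacology involved.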
No placebo control group. The study compared the DAA group to a "control" group that received nothing. Not a placebo — nothing. The participants knew whether they were taking the supplement or not. In any study measuring subjective or physiologically variable outcomes, the absence of blinding is a critical flaw. Testosterone levels fluctuate with sleep, stress, diet, time of day, and psychological state. Knowing you're taking a "testosterone booster" can influence all of those variables.
Duration: 12 days. Twelve days is not sufficient to establish a sustained hormonal effect. Testosterone levels fluctuate on daily, weekly, and monthly cycles. A 12-day snapshot can capture a natural fluctuation and mistake it for a treatment effect, especially in a small sample without proper blinding.
No follow-up. The study didn't track what happened after supplementation stopped. Did the effect persist? Did testosterone rebound? Was the "increase" simply a transient fluctuation? We don't know, because the study ended at the peak of its own narrative.
If you apply the evidence framework I've outlined previously, the Topo study fails on multiple criteria: sample size, blinding, duration, and study design. It generates a hypothesis worth testing. It does not constitute evidence.
What Happened When Better Studies Were Done
The real test of any finding is replication. If DAA truly increases testosterone by 42%, subsequent studies should show similar effects. They didn't.
Willoughby and Leutholtz, 2013. Published in Nutrition Research, this study used a proper randomized, double-blind, placebo-controlled design. Participants were resistance-trained men — the exact population most likely to be buying DAA supplements. They received 3 grams of DAA daily for 28 days, more than double the duration of the Topo study. The results: no significant changes in total testosterone, free testosterone, or any other androgen measured. The 42% effect vanished completely under proper experimental conditions.
Melville et al., 2015. Published in the Journal of the International Society of Sports Nutrition, this study went further and tested two doses: 3 grams and 6 grams of DAA daily for 14 days in resistance-trained men. The 3g dose produced no significant hormonal changes — consistent with Willoughby's findings. But the 6g dose produced a result nobody expected: testosterone levels actually decreased. Not by a trivial amount, either. The higher dose of the compound being sold as a testosterone booster appeared to suppress the very hormone it was supposed to enhance.
The Dose Paradox
The Melville finding raises an important mechanistic question. If DAA is involved in the testosterone production signaling cascade, why would higher doses suppress testosterone?
The most plausible explanation involves aromatase — the enzyme that converts testosterone into estrogen. There is preliminary evidence suggesting that DAA may upregulate aromatase activity at higher concentrations. If supplementing with DAA simultaneously stimulates some testosterone production and increases the rate at which that testosterone is converted to estrogen, the net effect could be zero — or negative. The body's hormonal system is not a simple input-output machine. Pushing one lever often moves others in compensatory or counterproductive ways.
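One way to see how "stimulate production, speed up conversion" can net out negative is a toy steady-state calculation. In this sketch, every number is a made-up assumption — not data from any DAA study — and circulating testosterone is modeled as production rate divided by removal rate, where removal includes aromatization to estrogen:

```python
def steady_state_level(production_rate, removal_rate):
    # Toy model: at steady state, circulating level scales with
    # production divided by removal (clearance + aromatization).
    return production_rate / removal_rate

baseline = steady_state_level(1.00, 1.00)

# Hypothetical high-dose scenario: production is nudged up 10%,
# but upregulated aromatase raises the removal rate by 20%.
high_dose = steady_state_level(1.10, 1.20)

print(f"Net change vs. baseline: {high_dose / baseline - 1:+.1%}")  # prints -8.3%
```

The stimulation is real, yet the measured level falls — which would be qualitatively consistent with the lower testosterone seen in the 6-gram arm of the Melville study, if the aromatase hypothesis holds.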
This is a recurring theme in testosterone supplementation: compounds that look promising in isolation interact with the body's feedback mechanisms in ways that neutralize or reverse the intended effect. The HPG axis is a tightly regulated system. It does not appreciate being pushed from the outside.
Why One Outlier Study Doesn't Equal Evidence
The D-aspartic acid story is a useful case study in how supplement marketing works. The process is simple:
- Find one study — ideally small, short, and with a dramatic headline number
- Build an entire marketing campaign around that single result
- Ignore every subsequent study that fails to replicate the finding
- Continue citing the original study for years, sometimes decades, as if the follow-up data doesn't exist
This is exactly what happened with Tribulus terrestris and its animal studies. It's what happened with DAA and the Topo study. And it keeps working because most consumers don't read past the marketing claim to check whether the cited study was ever replicated.
Replication is the cornerstone of scientific evidence. A single study, no matter how dramatic the results, is a data point. Multiple studies replicating the same finding under rigorous conditions — that's evidence. The DAA literature gives us one poorly controlled study showing an effect, followed by multiple well-controlled studies showing no effect or a negative effect. That's not ambiguous. That's a clear answer.
The Verdict
One poorly controlled study from 2009 launched a multi-million dollar ingredient. Every well-designed follow-up has failed to replicate it. The evidence is clear.
D-aspartic acid does not reliably increase testosterone in healthy men. At higher doses, it may actually suppress it. The 42% number that appears on product labels and marketing pages is drawn from a study that would not survive peer review by today's standards — and the subsequent literature has thoroughly contradicted its findings.
If you're currently spending money on a supplement that features DAA as a key ingredient, the data says you're paying for an amino acid that will be metabolized, excreted, and forgotten by your endocrine system. Your testosterone levels won't notice.
For those looking for what actually has replicable clinical evidence — compounds that have been tested in randomized, double-blind, placebo-controlled trials and produced meaningful, consistent results — I put together a detailed analysis of one such compound in my Shilajit clinical data review. The contrast with DAA is instructive.
For the framework I use to evaluate all supplement claims, see Why Most Supplement Studies Are Worthless. For the hormonal science that explains why most testosterone boosters fail, see my HPG axis deep dive.