Perplexity Case Study: How an Investigative Journalist Used AI Search to Break a Story in Half the Time
The Assignment: Connect the Dots Across 500 Data Points
A freelance investigative journalist was working on a story about a private equity firm’s pattern of acquiring healthcare companies, cutting staff, and raising prices. The hypothesis was not new — similar patterns had been reported for other PE firms. But this specific firm had not been investigated, and the journalist needed to build the case from public records.
The research required:
- Identifying every healthcare company the PE firm had acquired in the past 5 years
- For each acquisition: tracking staffing changes, pricing changes, and patient outcomes
- Cross-referencing SEC filings, state healthcare databases, CMS data, court records, and news reports
- Finding former employees willing to speak on record
- Verifying every claim with at least two independent sources
Traditional investigative research timeline: 4-6 weeks for the research phase alone, followed by 2-3 weeks of writing and fact-checking.
The journalist used Perplexity as a research accelerator — not to replace investigative rigor but to compress the hours spent on initial discovery, cross-referencing, and dead-end elimination.
The Research Workflow
Phase 1: Mapping the Acquisition History (Day 1-2)
"List all healthcare-related acquisitions made by [PE Firm] and its subsidiaries between 2021-2026. For each acquisition:

1. Target company name
2. Date of acquisition
3. Acquisition price (if public)
4. Type of healthcare business (hospital, clinic, home health, pharmacy, medical device)
5. Number of locations/facilities at time of acquisition
6. Source for each data point

Cross-reference: SEC filings, press releases, PitchBook data, and news reports. Note any acquisitions that were reported but not officially confirmed."
Perplexity identified 14 acquisitions with citations. The journalist verified each against SEC filings and added 3 more that Perplexity missed (smaller transactions not covered in major publications).
Time comparison:
- Traditional (searching SEC EDGAR, PitchBook, news archives manually): 8-12 hours
- With Perplexity (initial discovery + verification): 4 hours
- Savings: 50-67%
Phase 2: Post-Acquisition Analysis (Day 3-8)
For each of the 17 acquired companies, the journalist ran targeted queries:
"For [Acquired Company], since being acquired by [PE Firm] in [year]:

1. Any reported layoffs or staff reductions? (with source)
2. Any price increases reported by patients, media, or regulators?
3. Any regulatory actions, citations, or investigations?
4. Any lawsuits filed by patients, employees, or competitors?
5. Any quality ratings changes (CMS star ratings, Joint Commission)?
6. Any leadership changes (CEO/administrator turnover)?

Focus on verifiable facts with citations, not opinions."
Phase 3: Pattern Identification (Day 9-10)
"Based on the 17 healthcare companies acquired by [PE Firm], analyze the pattern:

1. What is the average staff reduction across acquisitions? (with specific numbers per company)
2. What is the average price increase reported?
3. What is the pattern of regulatory actions before vs. after acquisition?
4. What is the executive turnover rate post-acquisition?
5. Are there any acquired companies that did NOT follow this pattern? (important for balance)

Present as a data table with source citations for each cell."
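The pattern-analysis step is, at its core, a simple aggregation over per-company findings. A minimal sketch of that reduction, with entirely hypothetical company names and figures (none of these numbers come from the story):

```python
# Sketch of the Phase 3 aggregation: per-company findings collected in
# Phase 2 are reduced to averages, and companies that do NOT follow the
# pattern are flagged for balance. All figures below are hypothetical.
from statistics import mean

findings = [
    # (company, staff_change_pct, price_change_pct)
    ("Company A", -18.0, 12.0),
    ("Company B", -25.0, 9.0),
    ("Company C", 3.0, -1.0),  # does not follow the pattern
]

avg_staff_change = mean(f[1] for f in findings)
avg_price_change = mean(f[2] for f in findings)
# A company breaks the pattern if staffing held steady and prices did not rise.
outliers = [f[0] for f in findings if f[1] >= 0 and f[2] <= 0]

print(f"Average staff change: {avg_staff_change:+.1f}%")
print(f"Average price change: {avg_price_change:+.1f}%")
print(f"Companies not following the pattern: {outliers}")
```

Keeping the per-company rows (rather than only the averages) preserves the source citation for each cell, which is what the prompt asks for.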
Phase 4: Source Verification (Day 11-14)
Every claim in the story was checked with a citation-verification prompt:
"Verify this specific claim: '[claim from the story].' Find at least 2 independent sources that confirm this. If sources conflict, note the discrepancy. If only one source can be found, note that the claim is single-sourced."
The journalist also used Perplexity to find potential interview subjects:
"Find former executives or administrators of [Acquired Company] who have spoken publicly about the acquisition or about [PE Firm]. Check: LinkedIn profiles (for career history), news quotes, conference presentations, court testimony, and industry forum posts."
Results
Time Savings
| Research Phase | Traditional Time | With Perplexity | Savings |
|---|---|---|---|
| Acquisition mapping | 8-12 hours | 4 hours | 50-67% |
| Post-acquisition analysis | 40-60 hours | 20-25 hours | 50-58% |
| Pattern identification | 10-15 hours | 4-6 hours | 60% |
| Source verification | 15-20 hours | 10-12 hours | 33-40% |
| Total research | 73-107 hours | 38-47 hours | ~50% |
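The savings column follows directly from the hour ranges, pairing low estimates with low and high with high. A quick sketch of the arithmetic:

```python
# Recompute the savings column from the hour ranges in the table above,
# pairing low estimates with low and high with high, as the table does.
phases = {
    "Acquisition mapping": ((8, 12), (4, 4)),
    "Post-acquisition analysis": ((40, 60), (20, 25)),
    "Pattern identification": ((10, 15), (4, 6)),
    "Source verification": ((15, 20), (10, 12)),
}

for name, ((t_lo, t_hi), (p_lo, p_hi)) in phases.items():
    print(f"{name}: {1 - p_lo / t_lo:.0%}-{1 - p_hi / t_hi:.0%} saved")

total_trad = (sum(v[0][0] for v in phases.values()),
              sum(v[0][1] for v in phases.values()))
total_perp = (sum(v[1][0] for v in phases.values()),
              sum(v[1][1] for v in phases.values()))
print(f"Total: {total_trad[0]}-{total_trad[1]} h vs {total_perp[0]}-{total_perp[1]} h")
```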
The story was published 2.5 weeks after assignment — versus the typical 6-8 week timeline.
Story Quality
The editor noted: “The research was more thorough than typical. The acquisition table had more data points than I expected. The pattern analysis was stronger because the journalist had data across all 17 companies, not just the 5-6 they would have had time to deeply research manually.”
What Perplexity Did Well
- Cross-referencing speed: finding the same company mentioned in SEC filings, news articles, and court records simultaneously
- Citation quality: inline citations allowed quick verification
- Dead-end elimination: queries that returned “no relevant information found” saved hours of fruitless manual searching
- Pattern surfacing: synthesizing data across 17 companies revealed patterns that manual research might miss
What Perplexity Could Not Do
- Access paywalled databases: state healthcare regulatory databases, CMS quality data, and court records required direct access
- Interview sources: finding people to talk to required LinkedIn outreach, phone calls, and relationship building
- Verify nuance: Perplexity could confirm a staff reduction happened but could not determine whether it was a routine restructuring or a cost-cutting measure that affected patient care
- Editorial judgment: deciding what was newsworthy, how to frame the story, and which details to include required human judgment
Journalism-Specific Verification Standards
The journalist maintained these standards:
Verification rules:

1. No claim appears in the story solely from Perplexity output
2. Every fact has at least one PRIMARY source (official record, direct interview, court document)
3. Perplexity is used for DISCOVERY (finding leads) and CROSS-REFERENCING (finding additional sources), not as a final source
4. All Perplexity citations are clicked and verified
5. Claims from anonymous sources are corroborated by at least one documented source
6. The story explicitly does NOT cite "AI research" as a source
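These rules amount to a per-claim checklist, which is easy to run mechanically over a claim log. A minimal sketch, assuming a hypothetical claim record format (the field names and source types are illustrative, not from the story):

```python
# Minimal sketch of the verification rules as a per-claim check.
# The claim-record schema and source types are hypothetical.
PRIMARY = {"official record", "direct interview", "court document"}

def verification_problems(claim: dict) -> list[str]:
    """Return the list of rule violations for a claim record (empty = passes)."""
    problems = []
    sources = claim.get("sources", [])
    if claim.get("only_from_ai_output"):
        problems.append("appears solely from AI output")
    if not any(s["type"] in PRIMARY for s in sources):
        problems.append("no primary source")
    if len(sources) < 2:
        problems.append("single-sourced")
    if any(s.get("anonymous") for s in sources) and \
            not any(not s.get("anonymous") for s in sources):
        problems.append("anonymous claim lacks documented corroboration")
    return problems

claim = {
    "text": "Staff reduced after acquisition",
    "sources": [
        {"type": "official record", "anonymous": False},
        {"type": "news report", "anonymous": False},
    ],
    "only_from_ai_output": False,
}
print(verification_problems(claim))
```

Rules 3, 4, and 6 are process rules rather than per-claim properties, so they stay with the human workflow rather than the checklist.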
Lessons for Journalists and Researchers
AI Search Is a Discovery Tool, Not a Source
Perplexity helped the journalist find things faster. It did not replace the investigative work of reading documents, interviewing sources, and making editorial judgments. The time savings came from eliminating dead ends and accelerating the discovery phase.
Verification Discipline Must Be Absolute
In journalism, a single unverified claim can destroy credibility. Every Perplexity finding was treated as a lead to be verified, not a fact to be cited. This discipline is the difference between AI-assisted journalism and AI-generated content.
The Best Use Is Cross-Referencing
Where Perplexity added the most value was connecting dots — finding the same entity mentioned across different document types (SEC filing + news article + court record). This cross-referencing work is tedious for humans but fast for AI search.
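The cross-referencing described here is essentially grouping mentions by entity and keeping entities that appear in two or more independent document types. A minimal sketch with hypothetical mention records:

```python
# Group entity mentions by document type; an entity appearing in two or
# more independent record types is a cross-referencing hit worth pursuing.
# The entity names and mention records are hypothetical.
from collections import defaultdict

mentions = [
    ("Acme Home Health", "SEC filing"),
    ("Acme Home Health", "news article"),
    ("Acme Home Health", "court record"),
    ("Beta Clinic", "news article"),
]

by_entity = defaultdict(set)
for entity, doc_type in mentions:
    by_entity[entity].add(doc_type)

hits = {e: sorted(t) for e, t in by_entity.items() if len(t) >= 2}
print(hits)
```

In practice the hard part is entity resolution (matching "Acme Home Health LLC" to "Acme HH"), which is exactly the tedious step AI search accelerated here.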
Frequently Asked Questions
Is it ethical for journalists to use AI search tools?
Yes, with appropriate verification standards. AI search tools are research accelerators — like Google, LexisNexis, or PACER. The ethical obligation is verification, not the avoidance of tools.
Should journalists disclose AI tool usage?
Disclosure practices are evolving. Most newsrooms do not require disclosure of search tools used (you do not disclose using Google). However, if AI was used to generate any text that appears in the story, disclosure is required.
Can Perplexity access public records databases?
Perplexity searches the open web, which includes some public records that are published online. For database-specific searches (SEC EDGAR, PACER, state databases), direct access is still necessary.
How reliable is Perplexity for investigative leads?
Perplexity is highly reliable for finding public information that has been published online. It is not reliable as a sole source — every lead must be independently verified. Treat Perplexity findings as “things to verify,” not “things that are true.”