Perplexity Case Study: How an Investigative Journalist Used AI Search to Break a Story in Half the Time

The Assignment: Connect the Dots Across 500 Data Points

A freelance investigative journalist was working on a story about a private equity firm’s pattern of acquiring healthcare companies, cutting staff, and raising prices. The hypothesis was not new — similar patterns had been reported for other PE firms. But this specific firm had not been investigated, and the journalist needed to build the case from public records.

The research required:

  • Identifying every healthcare company the PE firm had acquired in the past 5 years
  • For each acquisition: tracking staffing changes, pricing changes, and patient outcomes
  • Cross-referencing SEC filings, state healthcare databases, CMS data, court records, and news reports
  • Finding former employees willing to speak on record
  • Verifying every claim with at least two independent sources

Traditional investigative research timeline: 4-6 weeks for the research phase alone, followed by 2-3 weeks of writing and fact-checking.

The journalist used Perplexity as a research accelerator — not to replace investigative rigor but to compress the hours spent on initial discovery, cross-referencing, and dead-end elimination.

The Research Workflow

Phase 1: Mapping the Acquisition History (Day 1-2)

"List all healthcare-related acquisitions made by [PE Firm]
and its subsidiaries between 2021-2026. For each acquisition:
1. Target company name
2. Date of acquisition
3. Acquisition price (if public)
4. Type of healthcare business (hospital, clinic, home health,
   pharmacy, medical device)
5. Number of locations/facilities at time of acquisition
6. Source for each data point

Cross-reference: SEC filings, press releases, PitchBook data,
and news reports. Note any acquisitions that were reported
but not officially confirmed."

Perplexity identified 14 acquisitions with citations. The journalist verified each against SEC filings and added 3 more that Perplexity missed (smaller transactions not covered in major publications).

Time comparison:

  • Traditional (searching SEC EDGAR, PitchBook, news archives manually): 8-12 hours
  • With Perplexity (initial discovery + verification): 4 hours
  • Savings: 50-67%

Phase 2: Post-Acquisition Analysis (Day 3-8)

For each of the 17 acquired companies, the journalist ran targeted queries:

"For [Acquired Company], since being acquired by [PE Firm]
in [year]:
1. Any reported layoffs or staff reductions? (with source)
2. Any price increases reported by patients, media, or regulators?
3. Any regulatory actions, citations, or investigations?
4. Any lawsuits filed by patients, employees, or competitors?
5. Any quality ratings changes (CMS star ratings, Joint Commission)?
6. Any leadership changes (CEO/administrator turnover)?

Focus on verifiable facts with citations, not opinions."

Phase 3: Pattern Identification (Day 9-10)

"Based on the 17 healthcare companies acquired by [PE Firm],
analyze the pattern:
1. What is the average staff reduction across acquisitions?
   (with specific numbers per company)
2. What is the average price increase reported?
3. What is the pattern of regulatory actions before vs. after acquisition?
4. What is the executive turnover rate post-acquisition?
5. Are there any acquired companies that did NOT follow
   this pattern? (important for balance)

Present as a data table with source citations for each cell."

Phase 4: Source Verification (Day 11-14)

Every claim in the story was checked through a citation-verification workflow run in Perplexity:

"Verify this specific claim: '[claim from the story].'
Find at least 2 independent sources that confirm this.
If sources conflict, note the discrepancy. If only one
source can be found, note that the claim is single-sourced."

The journalist also used Perplexity to find potential interview subjects:

"Find former executives or administrators of [Acquired Company]
who have spoken publicly about the acquisition or about
[PE Firm]. Check: LinkedIn profiles (for career history),
news quotes, conference presentations, court testimony,
and industry forum posts."

Results

Time Savings

Research Phase              Traditional Time   With Perplexity   Savings
Acquisition mapping         8-12 hours         4 hours           50-67%
Post-acquisition analysis   40-60 hours        20-25 hours       50-58%
Pattern identification      10-15 hours        4-6 hours         60%
Source verification         15-20 hours        10-12 hours       33-40%
Total research              73-107 hours       38-47 hours       ~50%
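The totals and the headline "~50%" figure can be sanity-checked with simple arithmetic; a quick sketch using the per-phase ranges from the table:

```python
# Per-phase hour ranges: (traditional_lo, traditional_hi, perplexity_lo, perplexity_hi)
phases = {
    "Acquisition mapping":       (8, 12, 4, 4),
    "Post-acquisition analysis": (40, 60, 20, 25),
    "Pattern identification":    (10, 15, 4, 6),
    "Source verification":       (15, 20, 10, 12),
}

trad_lo = sum(p[0] for p in phases.values())  # 73
trad_hi = sum(p[1] for p in phases.values())  # 107
ai_lo   = sum(p[2] for p in phases.values())  # 38
ai_hi   = sum(p[3] for p in phases.values())  # 47

# Savings band: worst case compares slow-Perplexity to fast-traditional,
# best case the reverse; the midpoint lands near 50%.
worst = 1 - ai_hi / trad_lo   # ~0.36
best  = 1 - ai_lo / trad_hi   # ~0.64
print(f"{trad_lo}-{trad_hi}h -> {ai_lo}-{ai_hi}h, savings {worst:.0%}-{best:.0%}")
```

The band is wide (roughly 36-64%), which is why the table's "~50%" is best read as a midpoint estimate rather than a guarantee.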

The story was published 2.5 weeks after assignment, versus the typical 6-9 week timeline.

Story Quality

The editor noted: “The research was more thorough than typical. The acquisition table had more data points than I expected. The pattern analysis was stronger because the journalist had data across all 17 companies, not just the 5-6 they would have had time to deeply research manually.”

What Perplexity Did Well

  • Cross-referencing speed: finding the same company mentioned in SEC filings, news articles, and court records simultaneously
  • Citation quality: inline citations allowed quick verification
  • Dead-end elimination: queries that returned “no relevant information found” saved hours of fruitless manual searching
  • Pattern surfacing: synthesizing data across 17 companies revealed patterns that manual research might miss

What Perplexity Could Not Do

  • Access paywalled databases: state healthcare regulatory databases, CMS quality data, and court records required direct access
  • Interview sources: finding people to talk to required LinkedIn outreach, phone calls, and relationship building
  • Verify nuance: Perplexity could confirm a staff reduction happened but could not determine whether it was a routine restructuring or a cost-cutting measure that affected patient care
  • Editorial judgment: deciding what was newsworthy, how to frame the story, and which details to include required human judgment

Journalism-Specific Verification Standards

The journalist maintained these standards:

Verification rules:
1. No claim appears in the story solely from Perplexity output
2. Every fact has at least one PRIMARY source (official record,
   direct interview, court document)
3. Perplexity is used for DISCOVERY (finding leads) and
   CROSS-REFERENCING (finding additional sources), not as a
   final source
4. All Perplexity citations are clicked and verified
5. Claims from anonymous sources are corroborated by at least
   one documented source
6. The story explicitly does NOT cite "AI research" as a source
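Rules 2 and 5 above (at least one primary source, anonymous claims corroborated) are simple enough to encode as a checklist. A minimal sketch; the source categories and status strings are illustrative assumptions, not the journalist's actual tooling:

```python
# Two-source rule: a claim clears verification only when it has at least
# two independent outlets, one of which is a primary source.
PRIMARY = {"official_record", "interview", "court_document"}

def claim_status(sources: list[tuple[str, str]]) -> str:
    """sources: list of (outlet, source_type). Returns a review status."""
    outlets = {outlet for outlet, _ in sources}        # independence check
    has_primary = any(kind in PRIMARY for _, kind in sources)
    if len(outlets) >= 2 and has_primary:
        return "verified"
    if len(outlets) == 1:
        return "single-sourced: flag in draft"
    return "needs primary source"

print(claim_status([("SEC filing", "official_record"), ("Reuters", "news")]))
# -> verified
```

The value of encoding the rules is less automation than auditability: every claim in the draft carries an explicit status that an editor can inspect.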

Lessons for Journalists and Researchers

AI Search Is a Discovery Tool, Not a Source

Perplexity helped the journalist find things faster. It did not replace the investigative work of reading documents, interviewing sources, and making editorial judgments. The time savings came from eliminating dead ends and accelerating the discovery phase.

Verification Discipline Must Be Absolute

In journalism, a single unverified claim can destroy credibility. Every Perplexity finding was treated as a lead to be verified, not a fact to be cited. This discipline is the difference between AI-assisted journalism and AI-generated content.

The Best Use Is Cross-Referencing

Where Perplexity added the most value was connecting dots — finding the same entity mentioned across different document types (SEC filing + news article + court record). This cross-referencing work is tedious for humans but fast for AI search.
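The cross-referencing described above amounts to grouping mentions of the same entity across document types, which is fiddly because names vary between filings and articles. A minimal sketch with a crude, illustrative name normalizer (real entity resolution needs more than this):

```python
import re

def normalize(name: str) -> str:
    """Crude company-name normalizer for cross-referencing (illustrative)."""
    name = re.sub(r"[.,]", "", name.lower())
    name = re.sub(r"\b(inc|llc|corp|co|holdings)\b", "", name)
    return " ".join(name.split())

# Hypothetical mentions of one company across three document types
mentions = [
    ("Acme Health Holdings, Inc.", "SEC filing"),
    ("Acme Health", "news article"),
    ("Acme Health Holdings LLC", "court record"),
]

by_entity: dict[str, set[str]] = {}
for name, doc_type in mentions:
    by_entity.setdefault(normalize(name), set()).add(doc_type)

print(by_entity)  # one normalized entity seen across all three document types
```

Collapsing name variants this way is exactly the tedious matching work the journalist offloaded: the AI surfaces candidate matches quickly, and the human confirms that "Acme Health" in a news story really is the entity in the SEC filing.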

Frequently Asked Questions

Is it ethical for journalists to use AI search tools?

Yes, with appropriate verification standards. AI search tools are research accelerators — like Google, LexisNexis, or PACER. The ethical obligation is verification, not the avoidance of tools.

Should journalists disclose AI tool usage?

Disclosure practices are evolving. Most newsrooms do not require disclosure of search tools used (you do not disclose using Google). However, if AI was used to generate any text that appears in the story, disclosure is required.

Can Perplexity access public records databases?

Perplexity searches the open web, which includes some public records that are published online. For database-specific searches (SEC EDGAR, PACER, state databases), direct access is still necessary.

How reliable is Perplexity for investigative leads?

Perplexity is highly reliable for finding public information that has been published online. It is not reliable as a sole source — every lead must be independently verified. Treat Perplexity findings as “things to verify,” not “things that are true.”
