
The crime & method. In a case reported in August 2025, an elderly couple received a call from someone who sounded like their grandson, claiming he had been in a serious crash and jailed for DUI after hitting a pregnant woman. A supposed "official" then demanded $9,000 in bail, arranged an Uber/Lyft pickup to take the victim to the bank, and later pressed for another $9,000. The station noted the use of AI to mimic the grandson's voice and the practice of scraping social media to generate a convincing clone. (WGAL)
The criminals. The driver was likely an unwitting gig worker sent to shuttle the victim; surveillance footage was turned over to police. This setup, a rideshare to the bank followed by a courier or cash handoff, has become common because it speeds up the cash-out and minimizes the scammers' exposure. (WGAL)
The victims & losses. The couple withdrew $18,000 in two transactions under intense time pressure. Their daughter later spoke to the media to warn other families. (WGAL)
Judicial history. As of the report, police were still investigating. In parallel, federal indictments have targeted cross-border rings behind $21M+ "grandparent" scams that use impersonation scripts, fake attorneys, couriers, and crypto, demonstrating the scale and organization behind many of these calls. (ICE)
Lesson for parents. Any script that combines a crash, an arrest, secrecy, and immediate cash or a rideshare warrants a hard stop. Call your child directly; if they are unreachable, conference in another family member and call the local jail or court at an independently sourced number. Banks will often intervene if you say "I think I'm being scammed," so let them help. And report attempts to your local police, your state attorney general, and the FTC. (FTC Consumer Advice)
(This report is a collaboration with ChatGPT-5.)