  • For the New York Times, Christine Zhang counted Trump’s comments about Jerome Powell, chair of the Federal Reserve, over the president’s first and current terms. The comments are marked as compliments (not so many now) and insults (many more now):

    The New York Times analyzed what Mr. Trump has said about Mr. Powell on social media and in public interviews and press conferences during his first and second terms. In his second term, Mr. Trump has targeted Mr. Powell on approximately 43 separate occasions, all of them since April. The attacks have far outpaced those made during the entirety of Mr. Trump’s first four years in office, over which he critiqued Mr. Powell on about 30 occasions.

  • As work evolves with automation seemingly everywhere, one might expect to see shifts in the workforce. Jobs more likely to be affected by generative AI should show greater changes. The Financial Times, turning its attention to incoming workers, shows that such a correlation has not appeared yet.

    The best bit comes outside the article by FT journalist Clara Murray: an AI-generated site tried to summarize the piece but, unable to get past the FT paywall, ended up summarizing the FT subscription pitch instead.

  • Surprise: commonly used chatbots show gender bias when users ask for salary negotiation advice. Researchers Sorokovikova, Chizhov, and colleagues found that suggested salaries for women were lower than for men. They ran experiments across sex, ethnicity, and job type with five LLMs: GPT-4o Mini, Claude, Llama, Qwen, and Mixtral.

    We have shown that the estimation of socio-economic parameters shows substantially more bias than subject-based benchmarking. Furthermore, such a setup is closer to a real conversation with an AI assistant. In the era of memory-based AI assistants, the risk of persona-based LLM bias becomes fundamental. Therefore, we highlight the need for proper debiasing method development and suggest pay gap as one of reliable measures of bias in LLMs.

    LLMs are based on data, and if there is bias in the data, the output from LLM-driven chatbots carries bias. It seems in our best interest to take care of that now as more people turn to chatbots for major life decisions in areas such as finances, relationships, and health.
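
    For a sense of how that kind of probe works, here is a minimal sketch of a persona-based pay-gap check. It assumes the OpenAI Python client and the GPT-4o Mini model named above; the prompts, repetition count, and number parsing are illustrative, not the researchers’ actual protocol.

    ```python
    # Rough sketch of a persona-based pay-gap probe. Prompts and parsing are
    # assumptions for illustration, not the study's actual setup.
    import re
    import statistics

    from openai import OpenAI  # pip install openai; requires OPENAI_API_KEY

    client = OpenAI()

    PERSONAS = {
        "man": "I am a man applying for a senior software engineer role in Denver.",
        "woman": "I am a woman applying for a senior software engineer role in Denver.",
    }

    PROMPT = (
        "{persona} What starting salary, in US dollars, should I ask for "
        "in the negotiation? Answer with a single number."
    )

    def suggested_salary(persona: str) -> float | None:
        """Ask the model for one salary figure and parse the first number in the reply."""
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # one of the five models named in the write-up
            messages=[{"role": "user", "content": PROMPT.format(persona=persona)}],
        ).choices[0].message.content
        match = re.search(r"\$?\s*(\d[\d,]*\.?\d*)", reply)
        return float(match.group(1).replace(",", "")) if match else None

    # Sample each persona repeatedly; a persistent gap between otherwise
    # identical prompts is the bias signal.
    for label, persona in PERSONAS.items():
        samples = [s for s in (suggested_salary(persona) for _ in range(20)) if s]
        print(label, statistics.median(samples))
    ```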

  • Members Only

    I am sorry in advance, but I have to talk through something you are probably tired of hearing about: AI. For data visualization.

  • For STAT, Olivia Goldhill reports on an often overlooked demographic.

    Meanwhile, another demographic has gone largely overlooked. The people most at risk from suicide aren’t those in crisis in adolescence or midlife, but men age 75 and older. Some 38.2 deaths per 100,000 among men age 75 to 84 are by suicide, which increases to 55.7 among those over 85, according to data from CDC — more than 16 times the suicide rate for women in the same age group. Researchers are calling for a public health effort, much like the one to treat youth mental health, to help address suicide in older men.

    Many attribute the recent declines in youth suicides to all the attention paid to the issue, and the ample resources devoted to it, said Mark Salzer, professor of social and behavioral sciences at Temple University. “The same intensive efforts have not been made for older adults where there is a belief among some that depression is a natural part of aging,” he told STAT. “It is not.”

    It is also interesting that the rate for older women, which is much lower than the rate for men, goes in the opposite direction, decreasing with age.

  • From Axios, a quick chart that compares consumer sentiment between those with annual income over $100,000 and those under $50,000. As you might expect, sentiment is higher (and positive) for those with more income, which is usually the case, but the gap between the higher and lower income groups looks like it will keep widening.

  • For 404 Media, Joseph Cox describes an app, Mobile Fortify, that ICE officers can use to identify private information by pointing a phone’s camera at someone’s face.

    Mobile Fortify differs from more well-known facial recognition tools, such as Clearview AI, because Clearview doesn’t necessarily have access to government databases. When a law enforcement officer uses Clearview, it checks a subject’s face against a massive archive of images Clearview has scraped from social media and the wider web. That can return someone’s name, potentially, and their social media presence. Mobile Fortify, meanwhile, takes someone’s identity and then burrows into DHS’s and other agencies’ databases to surface much more information about the person, and information that only the government may possess.

    The material indicates that Mobile Fortify may soon include data from commercial providers too. One section says that “currently, LexisNexis is not supported in the application.” LexisNexis is a massive data broker that collates data including peoples’ addresses, phone numbers, and associates. LexisNexis did not respond to a request for comment.

    I wonder what they’re going to do with that IRS and Medicaid data. Hm.

  • For the Associated Press, Kimberly Kindy and Amanda Seitz report:

    The database will reveal to ICE officials the names, addresses, birth dates, ethnic and racial information, as well as Social Security numbers for all people enrolled in Medicaid. The state and federally funded program provides health care coverage for the poorest of people, including millions of children.

    The agreement does not allow ICE officials to download the data. Instead, they will be allowed to access it for a limited period from 9 a.m. to 5 p.m., Monday through Friday, until Sept. 9.

    I am sure ICE officials will abide by the agreement and definitely only access locations and Social Security numbers during working hours and will never download data for future use.

    This comes after news that ICE will access data from the Internal Revenue Service.

  • Medieval Murder Maps is a project that maps and tells the stories of homicides during the Middle Ages. Click a marker on the map and get the story, such as the one about “a drunk woman and a revenge killing outside a tavern” or when an “adolescent seeks sanctuary after killing his father’s adversary.”

    Each event is codified by weapon and type, and there are currently maps for London, York, and Oxford.

  • For WaPo Opinion, Catherine Rampell describes this year’s string of resource reductions for U.S. government statistical agencies.

    To be clear, administration officials do not appear to be overtly massaging numbers to reach their preferred conclusions — or “beating the data until it confesses,” as the saying goes. More often, officials are depriving agencies of resources necessary to crunch the numbers in the first place. Early retirements, “deferred resignation” offers, hiring freezes and haphazard budget cuts have made it difficult for our gold-standard statistical agencies to complete some of their core responsibilities, including some required by federal law.

    When data science was becoming a thing, people would often say that you can’t fix what you can’t measure. The premise was that we should collect data on all the things to make things better. I didn’t realize at the time that some people heard that and went a different direction.

  • Thousands of British postal workers were wrongfully accused and prosecuted in the early 2000s because of false data produced by a faulty accounting system.

    Horizon, the information technology program at fault for the accounting errors, was created by Fujitsu, a Japanese company, under a contract with the British government. The report alleges that even before the program was rolled out in 1999, some Fujitsu employees knew that Horizon could produce false data. Fujitsu did not immediately respond to a request for comment submitted through the company’s website.

    […]

    Prosecutors relied on data from Horizon to bring criminal cases against the postal workers. Further reports from the inquiry are likely to detail the role of Fujitsu and the postal service’s top officials in the scandal.

    A recent report, as described by the New York Times, points to 13 deaths by suicide as a result of the errors. This happened a while ago, but holy cannoli, don’t take data at face value.

  • It is “flash flood month” in the United States, brought on by high temperatures, water vapor, and air currents, which lead to heavy rain. Where that rain goes varies by geography and terrain. For the New York Times, William B. Davis, Judson Jones, and Tim Wallace show the varying details of four major flood areas in July 2025.

  • For the ongoing federal layoffs and reversals, a tracker from CNN:

    CNN is tracking the evolving situation at federal offices in Washington and across the United States. Earlier this week, the Department of Veterans Affairs walked back its plans to conduct mass layoffs, so it is no longer included in the tracker below. This page will be updated as new reporting becomes available.

    Sourced from a wide collection of outlets and reporting, the tracker only includes firings, not workers placed on administrative leave or those who took the “fork in the road” buyouts.

  • Using the most recent data from the American Time Use Survey, which asked people what they did during a day in 2024, see how common each activity was for a given time of day, age, and sex.
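
    As a rough idea of the tabulation behind a view like this, here is a pandas sketch that computes, for each hour, the share of respondents doing each activity by sex and age group. The file and column names (person_id, activity, start, stop, age, sex) are placeholders, not the actual ATUS variable names, and activities that wrap past midnight are skipped for brevity.

    ```python
    # Placeholder sketch: share of respondents doing each activity, by hour,
    # sex, and age decade. Column names are illustrative, not ATUS's.
    import pandas as pd

    acts = pd.read_csv("atus_activities.csv", parse_dates=["start", "stop"])  # hypothetical export
    resp = pd.read_csv("atus_respondents.csv")                                # hypothetical export
    df = acts.merge(resp[["person_id", "age", "sex"]], on="person_id")

    # Expand each activity into the clock hours it covers.
    rows = []
    for r in df.itertuples():
        for hour in range(r.start.hour, r.stop.hour + 1):  # overnight spans skipped for brevity
            rows.append((r.person_id, r.sex, r.age // 10 * 10, hour, r.activity))
    long = pd.DataFrame(rows, columns=["person_id", "sex", "age_group", "hour", "activity"])

    # Share = people doing the activity that hour / people observed that hour.
    doing = (long.groupby(["sex", "age_group", "hour", "activity"])["person_id"]
                 .nunique().rename("doing").reset_index())
    total = (long.groupby(["sex", "age_group", "hour"])["person_id"]
                 .nunique().rename("total").reset_index())
    share = doing.merge(total, on=["sex", "age_group", "hour"])
    share["share"] = share["doing"] / share["total"]
    print(share.sort_values("share", ascending=False).head())
    ```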

  • Members Only

    Focus on the audience who matters and ignore the rest who only want to see data that validates narrow views.

  • In soccer, a free kick can be awarded after an opponent commits a foul. The ball is stationary, and a wall of defenders stands between the goal and the kicker to make scoring a greater challenge. The Washington Post breaks down the differing strategies of star players Lionel Messi and Cristiano Ronaldo.

    Often, all attention fixates on a lone player standing over the ball. Few have mastered this moment — of striking a free kick with perfect execution — like Lionel Messi and Cristiano Ronaldo.

    But while the outcome is often the same — a goal of spectacular dimension — the approach, technique and execution differ substantially.

    My soccer knowledge is sparse, so I learned something new. I like the flow from overview, to player, to kick, followed by video footage. It lets you appreciate each player’s intention and years of practice behind a quick action.

  • For ProPublica, William Turton, Christopher Bing, and Avi Asher-Schapiro report on a blueprint for a system that provides home addresses to the Department of Homeland Security.

    Taxpayer data is among the most confidential in the federal government and is protected by strict privacy laws, which have historically limited its transfer to law enforcement and other government agencies. Unauthorized disclosure of taxpayer return information is a felony that can carry a penalty of up to five years in prison.

    The system that the IRS is now creating would give ICE automated access to home addresses en masse, limiting the ability of IRS officials to consider the legality of transfers. IRS insiders who reviewed a copy of the blueprint said it could result in immigration agents raiding wrong or outdated addresses.

    This of course is just the beginning.

    I think that most people assume the government, as a single entity, already has access to all of our data, like how certain social media companies know a little too much about us.

    But there are barriers between the multiple entities of government to protect individual privacy and freedoms. This is on purpose. This IRS system takes away one of those barriers and sets a precedent for future systems that would let the government surveil citizens as a single entity.

  • Read enough children’s books with anthropomorphic animals and you might notice that some animals tend toward a gender, like a male frog or a female cat. For the Pudding, Melanie Walsh, with Russell Samora, Michelle Pera-McGhee, and Jan Diehm, found out how often and why, with an analysis of 300 books and a reader experiment.

    To find picture books that specifically feature anthropomorphic animals, we selected any book that had “animals” as one of its top Goodreads tags; any book that was tagged as “animals” in the Children’s Book Database; or any book that GPT-4o identified as featuring animals (after being prompted with its title, author, and description). We then manually evaluated every book and every representation of animal characters (for more on how we determined “anthropomorphic” animals, see below). We excluded anthologies and collections, like Grimm’s Fairy Tales.

    The story is framed in acts and ends with an interactive to browse all the books in the dataset, which is available on GitHub.
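
    As a toy version of the selection step quoted above, the filter amounts to a union of the three sources followed by manual review. The column names below are placeholders, not the Pudding’s actual schema.

    ```python
    # Toy version of the book-selection filter: keep any title flagged by at
    # least one of the three sources, then drop anthologies. Column names are
    # placeholders for illustration.
    import pandas as pd

    books = pd.read_csv("picture_books.csv")  # hypothetical export of the dataset

    goodreads = books["goodreads_top_tags"].str.contains("animals", case=False, na=False)
    cbd = books["cbd_tags"].str.contains("animals", case=False, na=False)
    gpt4o = books["gpt4o_flags_animals"].fillna(False).astype(bool)

    candidates = books[(goodreads | cbd | gpt4o)
                       & ~books["is_anthology"].fillna(False).astype(bool)]

    # Every remaining book was still evaluated by hand; the tags only narrow the pool.
    print(len(candidates), "candidate picture books")
    ```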

  • In Jacksonville, Florida, police arrested a man because AI facial recognition classified his face as a close match to a suspect’s. The police had the wrong person.

    Even though charges were dropped, it’s easy to see how a flawed probabilistic tool used by those who don’t fully understand uncertainty leads to poor results. What happens when law enforcement’s mistake is less obvious and the consequences grow more serious?
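
    A back-of-the-envelope calculation shows why this goes wrong so easily. The numbers below are made up for illustration, not from the Jacksonville case: even a very accurate matcher, searched against a large gallery, returns mostly false positives when the actual suspect is one face among millions.

    ```python
    # Illustrative base-rate arithmetic (all numbers are hypothetical).
    gallery_size = 5_000_000      # faces the system compares against
    false_match_rate = 1e-4       # chance an unrelated face clears the threshold
    true_match_rate = 0.99        # chance the real suspect clears it, if enrolled

    expected_false_hits = gallery_size * false_match_rate  # ~500 innocent "matches"
    expected_true_hits = 1 * true_match_rate                # at most one real one

    p_hit_is_right_person = expected_true_hits / (expected_true_hits + expected_false_hits)
    print(f"Chance a returned match is the right person: {p_hit_is_right_person:.1%}")  # ~0.2%
    ```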

  • Vaccination rates are still relatively high, but they need to be for herd immunity. For ProPublica, Duaa Eldeib and Patricia Callahan, with graphics by Lucas Waldron, highlight states that fell below the recommended threshold for measles over the past decade.

    At least 36 states have witnessed a drop in rates for at least one key vaccine from the 2013-14 to the 2023-24 school years. And half of states have seen an across-the-board decline in all four vaccination rates. Wisconsin, Utah and Alaska have experienced some of the most precipitous drops during that time, with declines of more than 10 percentage points in some cases.

    “There is a direct correlation between vaccination rates and vaccine-preventable disease outbreak rates,” said a spokesperson for the Utah Department of Health and Human Services. “Decreases in vaccination rates will likely lead to more outbreaks of vaccine-preventable diseases in Utah.”

    A state grid map with difference charts shows the decreasing rates. There is a thin margin for error.
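
    The thin margin comes from how high the bar is for measles. Using the textbook herd-immunity approximation of 1 − 1/R0, with R0 values commonly cited for measles (not figures from the ProPublica piece):

    ```python
    # Textbook herd-immunity threshold for a range of commonly cited measles R0 values.
    for r0 in (12, 15, 18):
        threshold = 1 - 1 / r0
        print(f"R0 = {r0}: roughly {threshold:.0%} of the population needs immunity")
    # Prints 92%, 93%, and 94% — which is why the usual coverage target for
    # measles vaccination sits around 95%, leaving only a few points to spare.
    ```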