In summary
During early COVID lockdowns, a teen worked with his family to raise $1.8 million in venture funding and, with a team of data scientists and psychologists, built a web filter that he now hopes will help students surf the web safely.
Like most kids, Aahil Valliani has been frustrated by the filters that his school uses to block inappropriate websites. Often, he has no idea why certain sites are blocked, especially when his web browsing is tied to his schoolwork.
Many students in this situation find a way around their districts’ web filters. They access the internet on their phones instead, or use proxy servers or virtual private networks to essentially access a different, unfiltered internet. Aahil, searching for a more systemic solution, teamed up with his younger brother and father to start a company called Safe Kids, raise almost $2 million in venture funding, and design a better filter.
As The Markup, which is part of CalMatters, reported in April, almost all schools filter the web to comply with the federal Children’s Internet Protection Act and qualify for discounted internet access, among other things. Most schools The Markup examined used filters that sort all websites into categories and block entire categories at once. Others scan webpages for certain off-limits keywords, blocking websites on which they appear regardless of the context. In both cases, the filters are blunt tools that result in overblocking and sometimes keep kids from information about politicized topics like sex education and LGBTQ resources.
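Neither approach is hard to picture in code. The TypeScript below is a minimal illustrative sketch, not any vendor’s actual implementation; the category names, keyword list and sample page are invented to show how keyword matching blocks a legitimate page regardless of context.

```typescript
// Minimal sketch of the two common filtering approaches described above.
// Categories, keywords and the sample page are illustrative, not any
// vendor's actual data.

const BLOCKED_CATEGORIES = new Set(["games", "adult", "social-media"]);
const BLOCKED_KEYWORDS = ["weapon", "drugs", "sex"];

// Category filtering: every site carries one pre-assigned label, and a
// whole label is blocked at once, leaving no room for context.
function categoryBlock(siteCategory: string): boolean {
  return BLOCKED_CATEGORIES.has(siteCategory);
}

// Keyword filtering: any page containing an off-limits word is blocked,
// no matter why the word appears.
function keywordBlock(pageText: string): boolean {
  const text = pageText.toLowerCase();
  return BLOCKED_KEYWORDS.some((word) => text.includes(word));
}

// Overblocking in action: a health-class page trips the keyword filter
// even though it is legitimate schoolwork.
console.log(keywordBlock("A health-class guide to sex education")); // true
```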
Aahil, now 17, points out that schools’ overly strict controls disappear as soon as kids graduate. “That’s a recipe for disaster,” he said. Kids, he contends, need to learn how to make good choices about how to use the internet safely when trusted adults are nearby so they are ready to make good decisions on their own later.
The Safe Kids filter turns web blocking into a teachable moment, explaining why sites are blocked and nudging students to stay away from them of their own accord. It uses artificial intelligence to assess the intent of a student’s search, reducing the number of blocks students see while conducting legitimate academic research. One example: if a student searches for Civil War rifles for a class assignment, Safe Kids would allow it. If a student tries to shop for an AK-47, it wouldn’t. Other filters would block both.
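The article doesn’t describe the internals of Safe Kids’ model, so the following TypeScript is only a hypothetical sketch of intent-aware filtering in the spirit of the rifle example; the labels, phrase matching and allow/block logic are assumptions standing in for a trained classifier.

```typescript
// Hypothetical sketch of intent-aware filtering. classifyIntent() stands
// in for a real machine learning model; a production system would score
// the query with a classifier rather than match phrases.

type Intent = "academic" | "shopping" | "unknown";

function classifyIntent(query: string): Intent {
  const q = query.toLowerCase();
  if (/\b(history|civil war|essay|homework|class)\b/.test(q)) return "academic";
  if (/\b(buy|shop|price|for sale)\b/.test(q)) return "shopping";
  return "unknown";
}

function shouldBlock(query: string, sensitiveTopic: boolean): boolean {
  if (!sensitiveTopic) return false;
  // A sensitive topic is allowed when the intent looks academic and
  // blocked when it looks like, say, shopping for a weapon.
  return classifyIntent(query) !== "academic";
}

console.log(shouldBlock("civil war rifles class essay", true)); // false: allowed
console.log(shouldBlock("buy an AK-47 online", true));          // true: blocked
```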
The filter also keeps student browsing data private, storing only categories of websites accessed, not URLs or search terms themselves. And it works through a Chrome browser extension, which means students can’t simply get around it with a proxy server or VPN while using that browser.
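As a rough illustration of that privacy model, and not Safe Kids’ actual code, a logger might use the URL to classify a visit and then deliberately discard everything but the category before storing it:

```typescript
// Sketch of category-only logging: the URL is needed to classify the
// visit but is dropped before anything is stored. Field names are
// illustrative.

interface VisitRecord {
  category: string; // e.g. "search" or "games" -- the only detail kept
  timestamp: number;
}

const visitLog: VisitRecord[] = [];

function recordVisit(url: URL, category: string): void {
  void url; // deliberately unused past this point: no URL, no search terms
  visitLog.push({ category, timestamp: Date.now() });
}

recordVisit(new URL("https://example.com/search?q=private+terms"), "search");
console.log(visitLog); // [{ category: "search", timestamp: ... }]
```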
Safe Kids got its start during the early COVID-19 lockdowns. Sitting around the dinner table with his father, a tech entrepreneur; his mother, a self-employed fashion designer; and his younger brother Zohran, a budding computer scientist, Aahil got his family to strategize how to help all the kids getting sucked into dark corners of the web and battling the mental health consequences of their internet use.
Their idea, building on the invasive and ineffective filters the brothers saw in school, essentially puts better training wheels on the internet. Aahil said his father did a bit of hand-holding in these early days, helping find board members and angel investors, as well as the data scientists who would train the machine learning model behind the filter and psychologists who could craft and test the filter’s hallmark pop-ups directing students toward more appropriate browsing. The company also spent time and money getting its designs patented. Aahil has three patents under his name, and Safe Kids has five.
As Aahil and his family were preparing to chase seed funding for Safe Kids, the ACLU of Northern California was demanding the Fresno Unified School District stop using a product called Gaggle, which districts use to monitor students’ internet use, block potentially harmful content, and step in if student browsing patterns indicate they may need mental health supports. The problem, according to ACLU attorneys, was that Gaggle amounted to intrusive surveillance, trampling on students’ privacy and free speech rights.
The Electronic Frontier Foundation leveled similar accusations against another web filter, GoGuardian, after obtaining records from 10 school districts, including three in California. The records revealed the extent of the software’s blocking, tracking and flagging of student internet use during the 2022-23 school year, when Aahil was piloting Safe Kids. Jason Kelley, a lead researcher on EFF’s GoGuardian investigation, The Red Flag Machine, looked into Safe Kids in response to an inquiry by The Markup. Accustomed to pointing out how bad filters are, he offered surprised praise for Safe Kids, commending its focus on privacy, its open-source code that offers transparency about its model, and its context-specific blocking.
“This is, really, I think, an improved option for all the things that we are generally concerned about,” Kelley said.
So far, Safe Kids has not been able to break into the school market. Still, Aahil hopes to one day sign a contract with a school district, and in the meantime he is marketing to parents, offering them a way to put guardrails on their kids’ home internet use. While Safe Kids started out charging for its filter, Aahil said a free, open-source version will be released next month.
One of the company patents is for a “pause, reflect, and redirect” method that leans on child psychology to teach kids healthy browsing habits when they try to access an inappropriate website.
“When kids go to a site the first time, we consider that a mistake,” Aahil said. “We tell kids why it’s not good for them and kids can make a choice.”
For example, if a student tries to play games during a lesson, a pop-up would say, “This isn’t schoolwork, is it?” Students can click a “take me back” button or “tell me more” link to get more information about why a given site is blocked. When students repeatedly try to access inappropriate content, their browsing is further restricted until they address the issue with an adult. If that content indicates a student might be in crisis, the user is advised to get help from an adult, and in a school setting, a staff member would get an automated alert.
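The article describes that escalation only in broad strokes, so the TypeScript below is a guessed-at sketch of how such a flow could be wired up, not the patented method itself; the two-attempt threshold, action names and crisis flag are invented for illustration.

```typescript
// Hypothetical escalation flow: first attempts get an explanatory nudge,
// repeats tighten restrictions, and crisis-related content alerts staff.
// The threshold and action names are assumptions.

type Action = "nudge" | "restrict" | "alertAdult";

const attempts = new Map<string, number>(); // per-student count, per category

function onBlockedAttempt(student: string, category: string, crisis: boolean): Action {
  if (crisis) return "alertAdult"; // e.g. self-harm content: notify a staff member
  const key = `${student}:${category}`;
  const count = (attempts.get(key) ?? 0) + 1;
  attempts.set(key, count);
  // Early tries are treated as mistakes and met with an explanation and a
  // "take me back" option; repeated tries restrict browsing until the
  // student talks with an adult.
  return count <= 2 ? "nudge" : "restrict";
}

console.log(onBlockedAttempt("student1", "games", false)); // "nudge"
console.log(onBlockedAttempt("student1", "games", false)); // "nudge"
console.log(onBlockedAttempt("student1", "games", false)); // "restrict"
```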
The teen expects to keep building the company, even as he shifts his focus to college admissions this fall. A rising senior at the selective Thomas Jefferson High School for Science and Technology in Alexandria, Virginia, one of the nation’s best public high schools, Aahil plans to major in business or economics and make a career out of entrepreneurship.
Safe Kids stands out in a web-filtering market whose blunt tools have barely grown more sophisticated over the last 25 years.
Nancy Willard, director of Embrace Civility LLC, has worked on issues of youth online safety since the mid-1990s. She submitted testimony for the congressional hearings that resulted in passage of the Children’s Internet Protection Act in 2000, and she describes the filtering-company representatives who showed up as snake-oil salesmen, selling a technology that addressed a symptom, not the root of the problem.
“We need to prepare kids to manage themselves,” Willard said. When traditional filters block certain websites with no explanation, kids don’t learn anything, and they’re often tempted to just circumvent the software.
“This approach helps increase student understanding, and hopefully there’s a way also in the instructional aspects (to increase) their skills,” she said about Safe Kids.
Students on Chromebooks in particular can’t circumvent Safe Kids, and its design aims to keep them from wanting to. Now Aahil and his family just need to find buyers.
Kelley said he’s not surprised Safe Kids hasn’t been able to break in yet, given the “hardening” of school security and student safety efforts over the last decade. “We’ve gone from having cameras and some pretty standard filters to having metal detectors, and locked doors, and biometrics, and vape detectors in the bathrooms, and these much more strict filters and content moderating control software,” he said, “and all this is hard to undo.”