Are Education Apps Putting Student Privacy at Risk?
Do you use YouTube in the classroom? Many teachers do; there are tens of thousands of great educational videos uploaded to YouTube on topics ranging from phonics to particle physics.
That's why a recent finding by the Federal Trade Commission (FTC) and the State of New York is so troubling. Google, YouTube's parent company, has been fined $170 million to settle charges that the platform collected personal data from children without parental consent.
A History of Privacy Violations at Google
The YouTube case isn't a first for Google. The company has a history of playing fast and loose with children's data privacy. Previous violations led to a 2011 agreement with the FTC that bars Google from misrepresenting its privacy policy and requires the company to submit to regular, independent privacy audits through 2031. In 2012, Google was fined $22.5 million for violating that agreement by placing improper tracking cookies in Apple's Safari browser.
Google's lack of concern for children's privacy is especially egregious when it comes to applications that are directly targeted to children and educators. YouTube hosts a multitude of videos that are attractive to children. Its children's app, YouTube Kids, does have some basic controls to gather parental consent and try to ensure that children are not signing in by themselves. But there are hundreds of channels on YouTube with content aimed at young children—it strains credulity to say that popular channels like Mother Goose Club or Alphablocks are targeted to a "13 and up" audience. Both YouTube and YouTube Kids track viewers with personally identifiable device information. On YouTube, that information can be sold to advertisers.
Google's track record with student privacy on its education suite, G Suite for Education, is also questionable. In the wake of a 2013 lawsuit, a Google spokesperson confirmed that the company scans and indexes the emails of all Google Apps for Education users for a variety of purposes, including advertising. These processes cannot be turned off. While ads are not served to students while they are logged into their G Suite for Education accounts, Google has been cagey about how it uses that information. It would be possible for Google to build profiles of students and serve them ads based on their G Suite for Education activity while they are navigating the internet outside of the education portal.
With more than 70 million active G Suite for Education accounts, Google is one of the largest technology players in classrooms in the U.S. and globally. The new YouTube ruling is a worrying signal that the company is still not taking children's privacy as seriously as it should.
Beyond Google: Half a Million Educational Apps
Of course, Google is not the only problematic player in the classroom. There are more than 500,000 educational apps in the Google Play and Apple App Stores. While a large number of these are geared toward independent use at home, many teachers now incorporate a variety of apps into their classrooms.
Many of these apps are great! Teachers may use them as fun supplemental instructional resources, extra skills practice, or to individualize instruction for students. However, most of them are built by tech start-ups with little background in children's privacy laws. Free apps are most likely to gather user data and track children's activities for the purpose of delivering targeted ads. However, even paid apps have been discovered to be tracking and using children's data in troubling ways, either with intent or through ignorance of best practices in children's privacy protection.
Some of these apps collect personally identifiable device information and track precise location information, creating serious privacy and potential safety concerns for children. This data may be sold to third-party companies for the purpose of building profiles for targeted advertising. Google has provided many developers with a loophole by allowing them to tag apps as targeted to "families in general" rather than specifically to children. It also does not provide much oversight to ensure that apps specifically targeted to children are following the rules. Apple has made attempts to police the children's section of their App Store, but an investigation by The New York Times still identified troubling data harvesting practices on some of their most popular children's apps.
Apps made specifically for education aren't immune, either. "Freemium" software solutions such as Edmodo may make money by selling advertising directly on the platform or by selling student or teacher information to third-party advertisers. A report by Fordham University found that some companies have made lists of student data that includes characteristics such as location, ethnicity, affluence, religion, lifestyle and even social characteristics such as "awkwardness."
Who is Responsible for Protecting Student Privacy?
When it comes to at-home use of technology, a case can be made that the onus is on parents to educate themselves about the apps and websites their children are using and take steps to protect their privacy. If parents are the ones overseeing the apps their children download and the websites they are allowed to visit, perhaps this could be construed as providing "permission" for any resulting data tracking. Making this argument, however, would seem to require quite a bit more transparency on the part of app designers and website creators on how data is collected, used and sold so that parents can make informed decisions.
When it comes to schools, the ruling is much clearer: schools are not authorized to grant permission to third-party apps and websites for collection of student data. The Family Educational Rights and Privacy Act (FERPA) puts the responsibility for protecting student data privacy squarely on schools and districts, not on third-party vendors. That means that schools, not the app creators, are ultimately responsible for complying with student data privacy laws. Schools must also comply with the Children's Online Privacy Protection Act (COPPA), which provides extra protection for children under the age of 13.
Schools have a special responsibility for protecting student data because use of educational apps and websites may be considered compulsory rather than voluntary. When use of an educational resource is an essential part of the educational experience or is part of a student's grade, students and parents may not have the opportunity to opt out if they have privacy concerns. This means that schools must be extra careful in selecting digital educational resources for students of all ages, not just those under 13.
That's why eChalk will never mine or sell student data. We believe that companies providing digital tools and apps to schools should be completely transparent about the ways that student data is collected, used and tracked, and who else may have access to that data. We also believe that sales models for digital tools and apps should make it clear how the company makes money and if selling student data is part of the model. Too often, the hidden price of free software is a loss of privacy for the students schools are bound to protect.