
Not Just for Educators: Lessons from InBloom’s Demise

How does one of the world’s most well-funded and successful education technology companies go the way of the dodo? For inBloom, much of the consternation and backlash, and ultimately the death knell, stemmed from concerns about children’s privacy. But inBloom’s story does not reside in solitude on a remote and distant island; rather, there are lessons to be gleaned from its shuttering that should inform plenty of other technology companies, privacy professionals, software and app developers, start-ups, legislators and regulators.

The story of inBloom brings together the hazy lines separating the data controller from the data processor, the double-edged sword of transparency and the consequential, post-inBloom equity gap for less well-funded schools across the country.

These were just some of the many issues discussed last week at the IAPP Privacy Academy and CSA Congress in San Jose, CA. Panelists during “Educating the Educators: Privacy Lessons” passionately examined the backstory, rigors, complexities and lessons of the company’s well-publicized unraveling. Moderated by IAPP VP of Research and Education Omer Tene, the session included insight from inBloom Chief Privacy Officer Virginia Bartlett, CIPP/US, CIPT, and FunnyMonkey Founder and edtech blogger Bill Fitzgerald.

InBloom was a cloud-based, open-source platform that helped solve tech and software interoperability issues for school districts, schools, teachers, students and parents, said Bartlett. “This was a back-end solution,” she added. “We had no access to student data, no access to the encryption keys. This could be a cloud-based storage solution or an operator model.”
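Bartlett’s description of a back end that stores records it cannot read maps onto a customer-held-key model: the district encrypts data before handing it to the vendor, and the vendor warehouses only ciphertext. The short Python sketch below is purely illustrative and is not inBloom’s actual architecture or code; the record fields and the use of the cryptography library’s Fernet primitive are assumptions made for the example.

    # Hypothetical district-held-key storage model (not inBloom's implementation):
    # the district encrypts records with a key it never shares, so the cloud
    # vendor stores only ciphertext it cannot read.
    from cryptography.fernet import Fernet

    district_key = Fernet.generate_key()    # generated and kept by the district
    district_cipher = Fernet(district_key)

    student_record = b'{"student_id": "12345", "iep_status": "active"}'

    # What the vendor receives and warehouses: an opaque encrypted blob.
    stored_blob = district_cipher.encrypt(student_record)

    # Only the key holder (the district) can recover the plaintext.
    assert district_cipher.decrypt(stored_blob) == student_record

Under an arrangement like this, the vendor looks much more like a processor than a controller, which is precisely the distinction the panel turned to next.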

Tene suggested that inBloom was a data processor, not controller, but was blamed as though it was a controller.

“In most cases,” Bartlett added, “the data controller is an elected official.” Still, inBloom was constantly answering questions from parents, Bartlett said, adding, “even as we couldn’t see the data.”

Bartlett said, “One thing we heard frequently was: ‘What type of information was inBloom storing on me? I want access.’ That is absolutely their right, but who is responsible for making that information available?”

“It’s a huge question,” Fitzgerald pointed out.

Responsibility varies district by district, state by state and by regional authorities, he said.

“The data flow goes from student to teacher to school to district, which then stores student data for every kid in school. Plus, classrooms integrate with different apps that have their own data trails, some covered by FERPA but others not. Beyond the district level, data then goes to regional education agencies that provide tech assistance,” Fitzgerald explained. “Then it flows to the state for accountability reporting.” But that’s not all, he said. States then report to the federal level for accountability, “and there’s data storage at each level. So you have warehousing and security needs at multiple levels, so it gets very complex.”
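To make that multiplication of storage points concrete, here is a minimal Python sketch of the reporting chain Fitzgerald describes. The level names follow his account; the Warehouse structure and the record fields are invented purely for illustration.

    # Illustrative model of the education data reporting chain: each level
    # retains its own copy of a record, so warehousing and security
    # obligations multiply with every hop.
    from dataclasses import dataclass, field

    LEVELS = ["teacher", "school", "district", "regional agency", "state", "federal"]

    @dataclass
    class Warehouse:
        level: str
        records: list = field(default_factory=list)

    def report_upward(record: dict, chain: list) -> None:
        """Pass a student record up the chain, leaving a stored copy at every level."""
        for warehouse in chain:
            warehouse.records.append(dict(record))  # each level keeps its own copy

    chain = [Warehouse(level) for level in LEVELS]
    report_upward({"student_id": "12345", "assessment": "grade 3 math"}, chain)

    # Six separate stores now hold the same record, each with its own retention,
    # access-control and breach-response obligations.
    print([(w.level, len(w.records)) for w in chain])

Every copy is a separate surface to secure, which is the complexity Fitzgerald was pointing to.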

From the start, inBloom had tried to be up front about the complexity and nature of its services. So it is perhaps ironic that in an age when transparency is presented as a silver bullet, the beginning of the company’s downfall came as a result of that very transparency. Early on, inBloom disclosed which states it had closed deals with as well as the nature and limits of its privacy and security obligations. Consequently, in a March 3 report for Reuters, reporter Stephanie Simon wrote, “While inBloom pledges to guard the data tightly, its own privacy policy states that it ‘cannot guarantee the security of the information stored … or that the information will not be intercepted when it is transmitted'.”

Referring to companies from every sector of the economy, Bartlett said, “We all know we can’t guarantee security … We were just transparent about it.”

Fitzgerald went further, saying perhaps the company was too transparent. “You can learn a lot about an organization’s mission, intent, growth and strategy through its privacy policy. It’s incredibly revealing. What inBloom did was good,” providing “useful and accurate information for other people already in the space.” But, he said, “there wasn’t the scaffolding in place” to help people understand what information was and was not there.

It didn’t help inBloom that critics associated it with the rollout of the controversial Common Core standards. Fitzgerald said this was due, in part, to the funding trifecta of Gates, Carnegie and Joel Klein, all staunch Common Core supporters.

Bartlett, who joined inBloom after the “firestorm” had already started, said, “The tragedy of what we did was we became less and less transparent” because of the outcry. “As a CPO, that really hurt. We wanted to be transparent.”

She also said that what happened to inBloom is not unique to the tech sector. “There is a moment when public understanding and privacy policies need to catch up with the technology,” Bartlett said. Clearly, this was one such case.

Bartlett discussed her personal relationship with edtech. “I have adopted children with special needs,” she said. “They have tons of data and I have no way to track that data or who’s handling it. I think there is a lifetime of opportunity here for technology engineers to design for districts and partners to design data flows and access.” She added, “It’s a leap of faith trusting all these personalized learning services.” But, she pointed out, the benefits are significant.

There’s irony here too, as Tene pointed out, since the New York law that eventually expelled inBloom from the state prohibited the provision of data dashboards by vendors.

And the loss of such benefits will be felt more acutely by lesser-funded schools, Tene said: “There’s an equity gap. Richer schools have already acquired technological solutions, and inBloom was going to provide lesser-funded schools with such technology.” This was one demonstrable loss for communities across the country.

Fitzgerald said inBloom could have done things differently. “If you’re serious about an open-sourced component to your services,” he said, “reach out to developers, prepare explanations and talk to your clients.” He noted that inBloom’s clients were the school districts and not the parents who were complaining. “The most affected were the kids and parents,” he said, so they should have gotten out there and talked with them and committed to some market research. “Have a PR team that is versed in quick reaction with tools that can counter misinformation.”

He was stunned by inBloom’s silence during the height of the campaign against it, and that vacuum, he pointed out, was filled by others. “There were things they did wrong,” he added, “but the things they’re getting criticized for are not wrong.”

Bartlett continued, “Having been in the firing line, it’s hard for me to not say that inBloom did everything right. There are things inBloom could have done better and there are lessons learned that all of us can take back.”

Ultimately, she said, “We need to design to the sense of the loss of control of data. The right people need to be in the room when designing something.”

Advice, perhaps, that can be taken by any organization designing new products and services.

2 Comments


  • C • Sep 28, 2014
    As a parent, I have a response as to why inBloom failed: the data was not inBloom's to collect in the first place. No matter how much money is put into packaging a PR campaign, the fact remains that this data does not stay within the school system, cloud storage is not secure and PII is shared with third parties. Further, anonymous student ID numbers can be re-identified, and data is sold. 75% of districts do not inform parents of data collection, COPPA and FERPA are bypassed, the data is not secure and parents cannot opt out. Data collection is done without parental consent and, in many cases, against parents' wishes. (See citations below.)

    inBloom and Data Mining: A Common Core Cousin: http://deutsch29.wordpress.com/2014/01/08/inbloom-and-data-mining-a-common-core-cousin/
    No Silver Bullet: De-identification Still Doesn't Work: https://freedom-to-tinker.com/blog/randomwalker/no-silver-bullet-de-identification-still-doesnt-work/
    PARCC and SBAC States Agree to Deliver Student-level Data to USDOE: http://deutsch29.wordpress.com/2014/09/09/parcc-and-sbac-states-agree-to-deliver-student-level-data-to-usdoe/
    Congressional Testimony on Student Data Collection: http://edworkforce.house.gov/uploadedfiles/reidenberg_testimony_final.pdf
    Cogs in the Machine: Big Data, Common Core, and National Testing: http://heartland.org/policy-documents/cogs-machine-big-data-common-core-and-national-testing
    Data as the New Currency: http://dupress.com/articles/data-as-the-new-currency/
  • C • Sep 28, 2014
    As a parent, I can tell you what led to inBloom's demise: the data were not theirs to collect in the first place. No amount of money spent on PR campaigns and commercials can change the fact that the data does not stay in schools; it is shared with state and federal agencies and third-party vendors, and data is sold. Further, the data is not safe: cloud security is often breached, and much of the data is PII. Anonymous data is not truly anonymous and can be re-identified. 75% of districts don't tell parents that data is collected and shared. Data collection happens without parental consent and, in many cases, against parents' wishes. The reason inBloom and others fail: parents do not want their children (minors) to be Human Capital or to be profiled. As soon as you collect data on someone, you are putting them into a category, a dangerous data paradox. (See citations below.)

    PARCC and SBAC States Agree to Deliver Student-level Data to USDOE: http://deutsch29.wordpress.com/2014/09/09/parcc-and-sbac-states-agree-to-deliver-student-level-data-to-usdoe/
    Congressional Testimony on Student Data: http://edworkforce.house.gov/uploadedfiles/reidenberg_testimony_final.pdf
    Cogs in the Machine: Big Data, Common Core, and National Testing: http://heartland.org/policy-documents/cogs-machine-big-data-common-core-and-national-testing
    No Silver Bullet: De-identification Still Doesn't Work: https://freedom-to-tinker.com/blog/randomwalker/no-silver-bullet-de-identification-still-doesnt-work/
    inBloom and Data Mining: A Common Core Cousin: http://deutsch29.wordpress.com/2014/01/08/inbloom-and-data-mining-a-common-core-cousin/
    Data as the New Currency: http://dupress.com/articles/data-as-the-new-currency/