This is the story of a politician who cried “hacker” after a reporter informed a state agency that sensitive information was embedded in its website’s HTML source code. While we wish this were a joke or a fictional story, it unfortunately is not. If the state of Missouri moves forward with the prosecution, this state action would sound the alarm for researchers and reporters, resulting in a chilling effect on the practice of responsible reporting.
In a sense, this recommendation is an abbreviation of the key things that our specifications test for. And you’ll be able to see that soon, as the Me2B Safe Website Specification for Respectful Technology is currently in the membership review stage of the approval process.
When we started drafting the Respectful Tech Specification a couple of years ago, it was immediately obvious that we didn’t have an adequate vocabulary to describe personal experiences in the digital world—never mind measure them.
The Me2B Deals or transactions that occur online typically involve three types of “currency”: money, attention or data. […] What sets online data monetization apart from the other two currencies is that often, customers have no idea what they are paying with – or that they are paying at all.
Our relationship with connected technology includes a set of “hidden affiliates” (third-party integrations) that most of us are not aware of. This guide describes how these relationships – conscious or not – emerge as we interact with digital technologies.
This real-life social context is currently missing from both existing privacy regulation and industry standards models for ethical technology […] Our model helps course-correct connected technology by pinpointing how the digital Me2B experience deviates from important social behavioral norms.
This guide provides examples of common Commitments and Deals, and shows how they map to the stages of a Me2B Lifecycle. It also reflects social norms for being anonymous, recognized, or known at each stage.
The Me2B Respectful Tech Specification measures technology behavior against 10 attributes that respectful Me2B Commitments should possess. These attributes represent how technology should treat us and our data at every step along the Me2B Relationship Lifecycle.
Our personal data flows do not start light and increase with time and trust. Instead, a firehose of personal information is released – and shared with a host of unseen third parties – as soon as we open an app or website. Me2BA’s Respectful Tech Specification V.1 is largely focused on testing for these invisible parallel dataverse data flows.
Twenty-five quintillion bytes of data are generated every day. That’s 25,000,000,000,000,000,000. In this era of data abundance, it’s easy to think of these bytes as a panacea – informing policies and spurring activities to address the pandemic, climate change, or gender inequality – but without the right systems in place, we cannot realize the full potential of data to advance a sustainable, equitable […]
TLDR: The Me2B Alliance believes apps including the AskingPoint SDK should be safe from malicious redirects or other exploits.
The discussion describes Me2BA’s approach to respectful technology behavior and the Alliance’s work in standards development and independent testing. The conversation touches on the broader issues of our evolving and personal relationships with technology products and services, and the potential for respectful behavior to provide a deeper and better level of engagement, to the benefit of individuals and businesses alike.
The current version focuses on mobile apps and websites, and encompasses only a portion of the harms outlined in the complete Me2B Digital Harms Dictionary. As the Safe Specification evolves, subsequent versions will grow to include more of the harms identified in the dictionary.
It was three years in the making, and this is how we got here.
We are excited to announce the Me2B Alliance is now Internet Safety Labs. We’ve changed our name but not our core mission.
We think that focusing on Facebook’s surveillance advertising is a good step in the right direction. However, there are several other significant threats to kids out there. In particular, Google’s YouTube is used by 69% of kids in the United States today, who reportedly spend approximately 1.5 hours a day on the app.