A distinction between cloud-based apps and cloud computing must be addressed. Cloud computing at an enterprise level, while argued against ad nauseam over the years, is generally considered to be a secure and cost-effective option for many businesses.
Even back in 2010, Microsoft said 70% of its team was working on things that were cloud-based or cloud-inspired, and the company projected that number would rise to 90% within a year. That was before we started relying on the cloud to store our most personal, private data.
According to Symantec, 89% of our Android apps and 39% of our iOS apps require access to private information. This risky use sends our data to cloud servers, to both amplify the performance of the application (think about the data needed for fitness apps) and store data for advertising demographics.
While 75% of American respondents believe it’s possible to have data privacy in today’s technology-driven world, a significant portion of respondents (one in five) worry about being able to maintain their privacy while online.
In order to make people feel better about giving you their data, you have to build trust.
Over 70% of U.S. respondents said they were more likely to do business with a company if they were offered the opportunity to delete their information. U.S. consumers also valued the ability to turn off location tracking, delete their browsing history, choose whom a company shares information with, and review the information that companies have about them.
Another key finding: about 40% of respondents, regardless of country, felt that their data is worth more than the free services they receive in exchange.
47% of Americans and 43% of Britons and Australians are willing to share personal data with CPG companies.
Publicis Sapient writes, “On average, four in five people in all five countries say they know little to nothing about what companies do with the data they collect.”
A recent survey concluded that companies are still tracking their users without fear of being charged or held accountable for how they use that data. The research was conducted by CRM Essentials and commissioned by Zoho. Sadly, 62 percent of Canadian and US companies allow third-party tracking code on their websites without letting their users know, even though around 55 percent of companies claimed they take great care with their users' data privacy.
To compile the survey, Zoho polled 1,416 business leaders at both small and large companies. Unethical use of customer data was common among a large majority of them.
The US Federal Trade Commission is “studying” social media’s use of private data. The agency seems particularly interested in how data is used for advertising—especially the data of minors.
It’s a bipartisan FTC study, but its timing is notable, with Washington preparing for the incoming 46th president. Does it herald a new era of regulation for apps and platforms?
Brands can do several things to enhance a personalized omnichannel experience without having to sacrifice data privacy. First and foremost is to keep customer data, especially PII and other sensitive data, within an organization’s own security perimeter — whether that’s on-premises or in the cloud. This is a basic if often misunderstood tenet. There’s a misconception that SaaS companies can offer indemnity for data loss, but indemnification does not pay for the loss of brand reputation or trust. The bottom line is that data is always at risk when it leaves a company’s perimeter, full stop. If you want maximum control, you don’t let it out of your security perimeter.
Beyond this, however, is applying advanced levels of security to customer data. This is crucial, especially with the consumer’s web experience rapidly becoming the primary interaction with a brand. Exposing personal data to a website necessarily brings it closer to the edge of the security perimeter, increasing the risk of exposure and misuse. Homomorphic encryption minimizes the risk by essentially never decrypting the data, even when it is in use.
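As a rough illustration of computing on data that is never decrypted, here is a toy sketch of the Paillier cryptosystem, an additively homomorphic scheme. This is a simplification: the tiny hardcoded primes are for demonstration only, and real systems use vetted libraries with much larger keys.

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic encryption.
# Demonstration only; real deployments use >2048-bit keys and audited libraries.
p, q = 293, 433              # tiny primes, for illustration
n = p * q
n2 = n * n
g = n + 1                    # standard generator choice
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)         # valid because L(g^lam mod n^2) = lam when g = n+1

def encrypt(m: int) -> int:
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Multiplying ciphertexts adds the plaintexts: the party computing this
# product never sees 17, 25, or their sum in the clear.
a, b = encrypt(17), encrypt(25)
assert decrypt((a * b) % n2) == 17 + 25
```

In principle, a service could aggregate encrypted customer values this way, returning only an encrypted total that the data owner alone can decrypt.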
Consumers are deeply concerned by data use and abuse. Protecting data is the new front line of retail. Brands that fail to inspire trust are being abandoned.
A worldwide survey found a third have already switched brands for data privacy reasons. Deloitte found 98 per cent of UK consumers are “concerned” by the way online brands use personal data. A further 85 per cent will avoid a brand if their data is not secure.
The good news is that brands who build trust with consumers will prosper. They'll build loyalty, and enjoy access to the data they require to develop products and market effectively. Olly Bray, Senior Partner at law firm RPC and noted tech commentator, puts it like this: “The only way brands will be able to meaningfully interact with consumers in the future is by building trust through transparency, human-centred design and ethical data use. We are working with our clients so they can take the lead on the issue. Crack this and your customers should be with you for life. Ignore it and they'll leave you for dust.”
Privacy is the yang to online’s yin – they are two forces that seem to oppose but actually can and should complement each other. Privacy isn’t a brand but it will have a big impact on them next year. Privacy concerns and regulatory and legal activity will increase since the newly created California Privacy Rights Act (CPRA) expands further the state’s landmark consumer privacy law, the California Consumer Privacy Act (CCPA), and is likely to be widely observed across the U.S. What’s more, Apple now requires all third-party developers to detail their app’s privacy information — and will deprecate from its devices the Identifier for Advertisers (IDFA) early next year, which means advertisers will no longer be able to identify, tailor messaging, and measure user-level behavior within iOS apps. As a result, brands will rely less on third-party and passively-collected customer data and pursue more “zero-party data,” which customers intentionally and proactively share with them. If done well, brands can use this trend to build trust and greater connection with customers – but that’s a big if.
In the survey of more than 30,000 consumers across 63 global markets, over 20% of respondents report having reduced or abandoned their use of a brand or company due to data privacy concerns. Moreover, 19% report having switched to or selected a competitor company for its better data policies.
"Overall, the results suggest a need—and opportunity—for companies to overcome consumers' skepticism and relieve their anxiety," said Denise Dahlhoff, Senior Researcher at The Conference Board. "Consumers' digital engagement has skyrocketed during the pandemic, making transparency about data practices more important than ever before."
- Marketers walk a fine line when it comes to privacy, as 20% of global consumers reported they have abandoned or reduced their use of a brand over its data practices, according to findings from a new survey The Conference Board conducted in partnership with Nielsen.
- About one-fifth (19%) of consumers have switched to a competitor that adheres to what they perceive to be better data policies, the Consumers' Attitudes about Data Practices report found. Globally, 44% of respondents said they would forego personalized content, including brand messages, offers and experiences, if it would mean not having to share their personal information.
- Those figures are even sharper in the U.S., with more than half (57%) of consumers spurning personalization to preserve their privacy. As marketers adjust their strategies to account for policy changes from major platforms like Apple and Google and stricter data-privacy regulations, the research makes clear that they need to establish greater trust and transparency if they want to keep consumers engaged.
Respondents understand the significant benefits of a mature privacy program as organizations experience greater gains across every area measured including: increased employee privacy awareness, mitigating data breaches, greater consumer trust, reduced privacy complaints, quality and innovation, competitive advantage, and operational efficiency. Of note, more mature companies believe they experience the largest gain in reducing privacy complaints (30.3% higher than early stage respondents).
Differential privacy can be used to protect everyone’s personal data while gleaning useful information from it. Differential privacy disguises individuals’ information by randomly changing the lists of places they have visited, possibly by removing some locations and adding others. These introduced errors make it virtually impossible to compare people’s information and use the process of elimination to determine someone’s identity. Importantly, these random changes are small enough to ensure that the summary statistics – in this case, the most popular places – are accurate.
In practice, differential privacy isn’t perfect. The randomization process must be calibrated carefully. Too much randomness will make the summary statistics inaccurate. Too little will leave people vulnerable to being identified. Also, if the randomization takes place after everyone’s unaltered data has been collected, as is common in some versions of differential privacy, hackers may still be able to get at the original data.
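The calibration trade-off above can be seen in a minimal sketch of randomized response, one of the simplest local differential-privacy mechanisms. The visited-location framing is simplified here to a single yes/no question, and the 50/50 coin flip is one arbitrary noise setting.

```python
import random

# Randomized response: each person flips a coin before answering, so any
# single report is deniable, yet the aggregate rate stays recoverable.
def randomized_report(truth: bool) -> bool:
    if random.random() < 0.5:       # heads: answer honestly
        return truth
    return random.random() < 0.5    # tails: answer uniformly at random

def estimate_rate(reports) -> float:
    # E[reported yes] = 0.5*p + 0.25, so invert to recover the true rate p.
    observed = sum(reports) / len(reports)
    return 2 * (observed - 0.25)

random.seed(0)
true_answers = [i % 10 < 3 for i in range(100_000)]   # true rate = 30%
reports = [randomized_report(t) for t in true_answers]
estimate = estimate_rate(reports)
assert abs(estimate - 0.30) < 0.02   # accurate in aggregate, noisy per person
```

Flipping the coin more often toward "random" strengthens individual deniability but widens the error bars on the estimate, which is exactly the calibration problem described above.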
Facebook’s Instagram is being investigated by Ireland's Data Protection Commission (DPC) over the handling of children’s personal data on the social media platform.
The regulatory body has launched two inquiries on Facebook after it received complaints from individuals that Instagram has made contact information on business accounts publicly visible to anyone accessing the application.
The first inquiry aims at investigating whether Facebook has a legal basis for processing children’s personal data and whether the company deploys adequate protection measures and restrictions on Instagram for children.
The second inquiry looks to establish whether Facebook has adhered to the regulator’s data protection requirements in relation to Instagram’s profile and account settings.
- The end of third-party cookies brings an opportunity for brands to build relationships of trust with their customers. Ultimately, cultivating trust comes down to how brands handle customer data, however they come by it.
- Part of adopting a new approach to engaging with data is considering when and how first-party personal information is collected. The “how” is an important foundational detail, as it may set the tone for the marketer’s relationship with the customer, determining whether or not a customer chooses to opt-in to share their information at all.
- Even after consent is obtained, customers and prospects have the right to change their minds. It’s an ongoing conversation between a customer and a brand, and so the ability to grant or revoke consent at multiple points in time is critical.
- Smart data use provides relevancy, and the brands that are positioned to thrive in a world without third-party cookies will likely be those that recognize a simple truth: You earn the right to be personal by first being relevant.
Ultimately, it’s up to the brand to use it to deliver better content and experiences and grow its customer relationships.
On 7 September 2020, the European Data Protection Board (“EDPB”) adopted draft guidelines on the targeting of social media users (the “Guidelines”). The Guidelines aim to clarify the roles and responsibilities of social media providers and “targeters” with regard to the processing of personal data for the purposes of targeting social media users.
Targeting services allow natural or legal persons (i.e., targeters) to communicate specific messages to the users of social media in order to advance commercial, political or other interests. The Guidelines state that the mechanisms social media providers can use to target users, as well as the underlying processing activities, may pose significant risks to users, including loss of control over their personal data, discrimination and exclusion as a result of targeting on the basis of special categories of personal data, and manipulation through misinformation. The Guidelines also raise specific concerns in relation to children.
What’s more, 70% of Android apps currently in use have asked for dangerous permissions at least once.
It may sound like an expensive, time-consuming process for brands, but it’s one that can be leveraged back to the customer as a differentiator. “Privacy is becoming the sixth P,” said Gerry Murray, Research Director, Marketing and Sales Technology for IDC.
In Murray’s opinion, there’s a powerful advantage for brands that involve customers in a process that serves them better instead of just grabbing as much data as they can from their customers’ phones. “It’s becoming increasingly apparent to customers, which brands are on which side of the spectrum,” he said. “Unless you’re a monopoly or in some kind of rare, single-source position in the marketplace, customers are going to prefer brands that steward their data.”
More than eight in 10 Australians consider organisations asking for personal information that doesn’t seem relevant to an interaction to be a misuse of their data, a new report has found. Seven in 10 respondents nominated privacy as a major concern for them, while 87 per cent wanted more control and choice over the collection and use of their personal information.
Specifically, 81 per cent of Australians said they considered an organisation asking them for personal information that doesn’t seem relevant to the purpose of the transaction and recording the types of websites they visit without their knowledge to be a misuse of their data.
Most Australians also believe they should have the right to ask a business to delete their personal information (84 per cent) and nearly eight in 10 want the right to seek compensation in the courts for a breach of privacy (78 per cent).
By 2023, 65% of the world’s population will have its personal data covered under modern privacy regulations, up from 10% in 2020, according to Gartner, Inc. “Security and risk management (SRM) leaders need to help their organization adapt their personal data handling practices without exposing the business to loss through fines or reputational damages.”
SRM leaders should adopt key capabilities that support increasing volume, variety and velocity of personal data by putting in place a three-stage technology-enabled privacy program: establish, maintain and evolve.
The establish stage includes foundational capabilities of a privacy management program. The maintain stage allows organizations to scale their privacy management programs. The evolve stage includes specialist tools that focus on reducing privacy risk with little or no impact on the data utility.
When a bank takes your information for a transaction, it goes through multiple stages of processing. While the transaction is being made, it is encrypted and is usually safe. But when the data is stored, it sits on a server somewhere, and that server is the weakest link in the chain. If a hacker gets in, it doesn’t matter how secure the transaction was; they can get your information.
When a transaction happens on the blockchain after you’ve set up a bitcoin wallet, it is cryptographically signed and verified by miners, who are then paid out in cryptocurrency. Because each block commits to the hash of the block before it, the transaction can’t be deleted or modified without everybody on the blockchain seeing it, and any modification would need to be verified all over again.
In other words, it can’t realistically happen: rewriting the ledger would mean out-computing the entire network. So your data is protected once the transaction is recorded.
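The tamper-evidence described above can be sketched with a minimal hash chain in Python. This is a deliberate simplification: real blockchains add digital signatures, Merkle trees, and proof-of-work on top of this basic linking.

```python
import hashlib
import json

# Minimal hash chain: each block commits to the previous block's hash,
# so altering any recorded transaction breaks every later link.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_chain(transactions):
    chain, prev = [], "0" * 64
    for tx in transactions:
        block = {"tx": tx, "prev": prev}
        prev = block_hash(block)
        chain.append(block)
    return chain

def verify(chain) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = make_chain(["alice->bob:5", "bob->carol:2"])
assert verify(chain)
chain[0]["tx"] = "alice->bob:500"   # tamper with an early transaction
assert not verify(chain)            # the next block's "prev" no longer matches
```

This is why a modification "can't happen" quietly: every participant re-running the verification immediately sees the broken link.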
Doing away with third-party cookies has been touted as “privacy enhancing” for consumers. However, years of research have shown that doing away with cookies, both 1P and 3P, does not necessarily increase privacy. This is because ad tech companies have found workarounds to continue to uniquely identify users, even without using any cookies.
Browser Parameters - This is called “fingerprinting.” Just like fingerprints are unique to each individual, digital fingerprints can be made from combinations of browser variables. For example, by gathering variables like the browser name and version, screen resolution, list of fonts and plugins, and IP address and location, companies can identify unique users with 99% accuracy.
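As a hypothetical sketch of the technique, the combination of browser variables can be hashed into a stable identifier. The attribute names and values below are illustrative, not a real browser API.

```python
import hashlib

# Hypothetical fingerprinting sketch: combine browser attributes a tracker
# can read without cookies into a canonical string, then hash it into a
# stable ID that recurs on every visit from the same configuration.
def fingerprint(attrs: dict) -> str:
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/121.0",
    "screen": "2560x1440",
    "timezone": "America/New_York",
    "fonts": "Arial,Helvetica,Noto Sans",
    "language": "en-US",
}
print(fingerprint(visitor))  # same attributes -> same ID on every visit
```

The more variables collected, the rarer any exact combination becomes, which is how such composites reach the near-unique identification rates cited above.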
Demographic Attributes - Adtech companies also claim that users’ privacy is already protected because no PII is collected or used. However, privacy researchers spotlight the lie of ‘anonymous’ data. “Researchers from two universities in Europe have published a method they say is able to correctly re-identify 99.98% of individuals in anonymized data sets with just 15 demographic attributes.”
Browsing Histories - In addition to being able to uniquely identify or re-identify users with browser variables or demographic attributes, new research from Mozilla shows that browsing histories can also be used to uniquely identify individual users.
How Common Is Fingerprinting? "We find that browser fingerprinting is now present on more than 10% of the top-100K websites and over a quarter of the top-10K websites," the research team said.
Our research found that:
- 88% of Americans feel they ultimately own any personal data they give to a company.
- 97% believe it’s their right to have access to their personal data.
- 88% are frustrated by the fact that they don’t have control over their personal data and wish the process of retrieving it from companies were easier.
- 65% are simply curious to know what information companies have on file about them.
- 65% of those surveyed want access to their personal data so that they can choose what companies can and can’t collect, and 43% want this information deleted altogether. Two in five (40%) would even update their information.
- 56% want immediate access to their personal data (and 80% want it back in a timeframe of 24 hours or less). However, only a quarter (26%) think they would actually get their data instantly if they were to ask for it.
New information from Vice's Motherboard reveals that 12 California lawmakers have finally begun taking steps to understand how the California DMV sells drivers' data, as well as laying the groundwork to extend consumer protections and prevent what may be an egregious misuse of personal information.
The lawmakers are now requesting that the DMV reveal what kind of companies it has sold or provided data to, specifically inquiring if it has ever knowingly provided information to debt collectors, private investigators, data brokers, or law enforcement agencies. The California DMV had previously revealed to Vice that the companies requesting data "may include insurance companies, vehicle manufacturers, and prospective employers."
Further information requests involve the use of social security numbers, photos, licensee data of undocumented immigrants, and how the department handles requests for opt-outs.
It's estimated that the California DMV earns roughly $50 million each year from selling the personal information of Californians, but it isn't alone in doing so. Other states like Rhode Island have also partaken in this revenue generator, raking in hundreds of thousands of dollars using the same practices. Both states are a drop in the bucket compared to Florida, which brought in $77 million in 2017.
You might be asking yourself, just how easy is it to obtain someone else’s personal information, documents, account details?
We certainly were.
To see just how prevalent such items of personal data are being listed, and at what price, we sent our researchers on a data-gathering mission into the dark web.
Only 22% of Americans report “often” or “always” reading online privacy policies, and that’s solely for websites which require browsers to affirmatively agree to a privacy policy (i.e., flashing a pop-up with some form of “check the box” affirmation). This does not engender much confidence that Americans are actively seeking out and consenting to the privacy policies embedded within the myriad of websites they visit on a daily basis. And who can blame them – a 2008 study estimated it would take 244 hours each year to read every privacy policy in full for all the websites an average web browser visited annually.
So note the structural framework of U.S. Sen. Sherrod Brown’s (D-Ohio) Data Accountability and Transparency Act of 2020 (DATA 2020): rather than maintaining the permissive data privacy legal framework which allows data processors to manage consumer personal data largely as they see fit, so long as they disclose their intentions in a lengthy privacy policy (which, as we’ve established, the vast majority of their consumers will never actually read), Sen. Brown instead suggests a restrictive legal framework that will dictate, by statute, when and how data processors may use consumer’s personal data, and to what extent.
"The decision to kill off third-party cookies is widely regarded as a necessary evil, but who gets to determine their replacement? The uncertainty has prompted some companies to develop turnkey solutions that are privacy compliant. Google—whose decision to eliminate third-party cookies from Chrome by 2022 created the vacuum—also has a solution, as does ad retargeting kingpin Criteo. Whichever technology emerges, it will address critical advertiser capabilities including ad targeting, frequency capping, user privacy and attribution. And it will enable an enormous advertising market; last year, third-party cookies helped fuel nearly 30 percent, or $38 billion, of all U.S. digital ad spend."
The weakest link in any digital endeavor is data security, and it will remain so until there's a broad-based effort to clamp down on privacy abuses.
Collect zero party data from your consumers NOW through marteq.io's FREE pilot program. Contact joe@marteq.io to qualify. #martech #marketing