Summarise the responses in the Discussion thread in the attached file using the template below
 
Discussion Thread 1
Title:
… Article Summary …
… Summary of Individual Postings…
… Final Opinion…

Discussion Thread 1 :

Article Title: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

In the article “Your Deleted TikTok Content Can Still Be Used Against You By the FBI”, Thomas Brewster details how the FBI used old cellular tower data to link seven bank robberies in five states to a suspect’s phone number. The article goes on to explain that even if data is deleted by its owner, it can still be obtained by the FBI, depending on the platform (Facebook, Google, etc.). In this case, TikTok does not even have a concrete policy for when deletion actually occurs on its servers after a user deletes content.
I find it interesting that the FBI didn’t even have to use five separate warrants to operate across all the states. The internet and its data are borderless; legal jurisdiction is not. All of this is occurring as Big Tech is currently under scrutiny in Washington. How do you feel about data harvesting and how governments and companies use your data?

Response:

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

This is certainly a large discussion being had, and I think we as consumers don’t even understand the full gravity of the data collection happening and how it is used. When you agree to the “terms” and policies on sites, do you actually read them and understand how they will impact you? I think every consumer needs to be aware of the risks in place: every time you give or enter information on the internet, there is always a chance of a breach. There are companies that obviously pour a lot of resources into security, but there are probably many that do not and only meet the bare minimum expectation. Isn’t big brother always watching?
I personally decline to give phone numbers, email addresses, etc., when businesses ask for them, and I think sometimes they find me rude. However, I’m sure most of us in this class realize the risk: every time you provide that information, it goes into someone’s database, where it is susceptible to breaches or perhaps sold for “marketing” purposes. With pretty much all major companies utilizing information systems, it is almost impossible to keep your data completely secure. Have you ever had your data stolen or an account hacked? I have; luckily, my bank caught it quickly.

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

I find the point you bring up about the usage of search warrants in these cases very interesting as well. I am not by any means a lawyer or anything of the sort, but wouldn’t it seem reasonable to have some sort of search law to reinforce data security? I think it is ridiculous that law enforcement can retrieve information about your personal engagements even when you no longer have access to it yourself. I also think that each of the technical “owners” of the data should be under some sort of lawful agreement to secure users’ data, regardless of whether deletion has occurred. This would ensure that at every step, law enforcement would need some sort of search warrant to proceed.
I feel like I am another one of those people who hates the idea of information about myself being stored, but always reaps the benefits of convenience. I normally don’t approve of applications using my data, but every now and then I do, just to give those applications the full ability to use my data to enhance convenience within the app itself. Would I do this for every application, website, etc.? Of course not, but saying that it doesn’t somewhat enhance the user experience would be a lie coming out of my own mouth.

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

This is an interesting situation! It comes at a time when the government has ever more access to all our data through registration and surveillance, and is currently trying to pass new laws giving it even greater legal access to information. When accessibility increases, the security of that data decreases, whether through private industry or government; neither of which I personally believe is a good thing. The government, however, will try to make everyone believe it is for their own good.
Data mining is increasing at an accelerated rate with all the online quizzes, questionnaires, and other programs where people willingly give up information in the name of fun without understanding what they are really doing. How many tattoos do you have? What was your first car? What was your high school mascot? All of these questions appear in fun online quizzes, and quite often they are the same security questions used for account access.
While data mining is invasive, the data is on the internet forever, and that information may have repercussions years later. The user is ultimately responsible for the information they freely make accessible to data mining. Education, awareness, and user knowledge need to be promoted.

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

Hi,
Interesting article. Data harvesting laws need to be set in stone to prevent companies and even the government from making use of private data. The more the borderlessness of the internet is used as a means to investigate users’ private data, the more users will come to fear technology, and the greater the risk of manipulating whole populations with intelligence derived from data.

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

As technology advances, it is easier for the FBI and other law enforcement agencies to retrieve deleted data. Computer or mobile forensics investigations are usually carried out when the information contained in the devices is crucial to a case. Investigators sometimes use physical acquisitions to obtain deleted data, which involves using special tools to make a copy of the storage data and dumping it into a file. Physical acquisition is useful in recovering lost or deleted text messages.
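As a rough sketch of what a physical acquisition boils down to (this is an illustration only, with hypothetical file names; real investigations use validated forensic tools, write blockers, and chain-of-custody procedures), the idea is a byte-for-byte copy of the storage medium, hashed so the copy can later be verified against the original:

```python
import hashlib

# Hypothetical paths: a storage source and the output dump ("image") file.
SOURCE = "evidence_source.bin"   # in practice, e.g. a block device behind a write blocker
IMAGE = "acquisition.img"

def acquire(source: str, image: str, chunk_size: int = 1 << 20) -> str:
    """Copy every byte of `source` into `image`, returning a SHA-256 hash
    so the copy can be verified against the original medium."""
    digest = hashlib.sha256()
    with open(source, "rb") as src, open(image, "wb") as dst:
        while chunk := src.read(chunk_size):
            dst.write(chunk)      # deleted files may still live in these raw bytes
            digest.update(chunk)
    return digest.hexdigest()
```

Deleted content can survive in such an image because deletion typically only removes the filesystem’s references to the data; the underlying bytes remain until they are overwritten.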

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

I found your post very interesting and informative, because you would think that TikTok would have a set policy when it comes to deleting data, especially when it involves criminal activity that can be traced to track down a suspect. TikTok being a foreign-owned company can make it really hard for American authorities to obtain access if they needed to look into the platform for a potential case. It is also just not very reliable, because you would want to be able to track down deleted content in case any user complaints arise. Overall, I found your post informative, and it helps raise awareness about this issue on the application.

RE: Your Deleted TikTok Content Can Still Be Used Against You By The FBI | June 15 | Forbes

I think this is a very big topic that not many people really understand. We blatantly give our personal information (phone numbers, emails, etc.) to companies without reading the terms and conditions or what they plan on doing with that data. Because of that, our data is everywhere on the internet, and there is someone out there who is going to break into a database with your information and then sell it on the dark web for not a lot of money. I think there need to be more laws around privacy: what companies can do with your data, and what we as individuals can do to control the data that companies have. I think the GDPR is well in the right direction, and something like it needs to be brought to the United States.

Discussion Thread 2

How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned

The sharing of personal information has become a critical and hot topic in a variety of settings, including the boardroom, government offices, the Supreme Court, and the kitchen table. Nicole Nguyen and Cordilia James’ recent article, “How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade is Overturned”, was published in the Wall Street Journal yesterday. This article highlights the importance of data privacy, using a “period and ovulation predictions and symptom tracking” app called Flo as an example to drive home the implications of the Supreme Court overturning Roe v. Wade. Like many other apps, Flo is used because of its convenience and the additional insights it offers, compared to the alternative of tracking menstrual cycles manually. However, the writers ask: “how much do you know about the ways apps and trackers collect, store—and sometimes share—your fertility and menstrual-cycle data?… Phone and app data continue to be shared and sold without prominent disclosures…[furthermore,] HIPAA [laws] doesn’t apply to data you put into an app, even a health-related one”. This is all the more reason why users ought to scrutinize how their data is being managed by the apps they use.
Another reason users need to take a second look at privacy policies is that, for example, “In a scenario where Roe is overturned, your digital breadcrumbs—including the kind that come from period trackers—could be used against you in states where laws criminalize aiding in or undergoing an abortion, say legal experts”. Nguyen and James cite Leah Flower, a research director at the University of Houston’s Health Law and Policy Institute, who states that “[menstrual data] has been relevant to the government before, in investigations and restrictions,” and who gave the example of a potential precedent set by a “2019 hearing where Missouri’s state health department admitted to keeping a spreadsheet of Planned Parenthood abortion patients, which included the dates of their last menstrual period”.
The writers provide tips on how users can review whether an app shares their data, starting with reading the privacy policy to understand its data management practices. Users can check whether the app encrypts their data, “that is, scrambled into an incoherent string of code when it’s traveling to the server or stored on the server—to prevent hackers from accessing it”. The article also states that “end-to-end encryption, however, is an even more secure form of protection”, because then even the app company cannot view your data and has nothing “useful” to share with third-party marketers. Also, users can look into whether the app can sell or share the data, and whether the company will notify the user if the government requests the user’s information. Lastly, the article cautions against “any apps that ask for your location but don’t provide any visible use for the information”, as it can be used in ad targeting. In addition to all these tips, the article provides a comparison of how data is managed across various menstrual-tracking apps, including Flo, Apple Health, Clue, Fitbit, Glow, and Natural Cycles.
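To illustrate the distinction the article draws, here is a toy sketch (the cipher below is a one-time-pad XOR used purely for illustration, not what real apps use, and the tracker entry is hypothetical): with ordinary server-side encryption the company holds the key and can still read your entries, whereas in an end-to-end scheme the key never leaves the user’s device, so the server stores only ciphertext.

```python
import secrets

def device_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy one-time-pad XOR 'encryption' performed on the user's device.
    Illustration only; a real app would use a vetted cipher such as AES-GCM."""
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# The key stays on the phone; the server (and the app company) see only ciphertext.
key = secrets.token_bytes(64)              # never uploaded
entry = b"cycle day 14, symptom: cramps"   # hypothetical tracker entry
uploaded = device_encrypt(entry, key)      # this is all the server stores

# XOR is its own inverse, so the same function decrypts back on the device.
assert device_encrypt(uploaded, key) == entry
assert uploaded != entry
```

The practical consequence is the one the article notes: under end-to-end encryption the company has nothing “useful” to hand to marketers, or to anyone requesting the data, because it never holds the key.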

Responses:

RE: How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned

Hello Barbara. This is definitely a hot topic (Roe v. Wade), as the court’s opinion in Dobbs v. Jackson Women’s Health Organization is set to come out in just a few days. One of my interests in my studies of law is the area of data and its legal limits. What is most interesting is how data is being accepted into the courtroom (allowed as evidence in trials). As you may know, there are many rules of evidence (both criminal and civil). It will be interesting to see whether prosecutors, in states where abortion may become stricter or outright illegal, will seek out this type of data (period tracking).
What is also interesting is how data analytics and IS are influencing prosecutors across the country to go after and target various cybercrimes (like child pornography). More to your point, I think companies will find it a major struggle to share data with governmental agencies and still retain customers. Google and Apple are great examples of this, as they do not willingly share information with governmental agencies, even when it might be helpful.

RE: How Period-Tracker Apps Treat Your Data, and What That Means if Roe v. Wade Is Overturned

Barbara, Thanks for bringing up a very relevant point of view.
Period-tracking apps are often not covered under the Health Insurance Portability and Accountability Act, or HIPAA, though if the company is billing for health care services, it can be. Still, HIPAA doesn’t prevent the company from sharing de-identified data. If the app is free — and the company is monetizing the data — then “you are the product” and HIPAA does not apply.
A 2019 study published in the BMJ [1] found that 79% of health apps available through the Google Play store regularly shared user data and were “far from transparent.”
When it comes to marketing, a pregnant person’s data is particularly of high value and can be hard to hide from the barrage of cookies and bots. Some period-tracking apps, which often ask for health information besides menstrual cycle details, take part in the broader internet data economy, too.
“The data can be sold to third parties, such as big tech companies; or to insurance companies, where it could then be used to make targeting decisions, such as whether to sell you a life insurance policy, or how much your premium should be,” said Giulia De Togni, a health and artificial intelligence researcher at the University of Edinburgh in Scotland.
Flo Health, headquartered in London, settled with the Federal Trade Commission last year over allegations that the company, after promises of privacy, shared health data of users using its fertility-tracking app with outside data analytics companies, including Facebook and Google. [2]
In 2019, Ovia Health drew criticism for sharing data — though de-identified and aggregated — with employers, who could purchase the period- and pregnancy-tracking app as a health benefit for their workers. People using the employer-sponsored version must currently opt in for this kind of data-sharing.
Ovia’s roughly 10,000-word privacy policy details how the company may share or sell de-identified health data and uses tracking technologies for advertisements and analytics on its free, direct-to-consumer version.
For European residents, companies must comply with the stricter General Data Protection Regulation, which gives ownership of data to the consumer and requires consent before gathering and processing personal data. Consumers also have the right to have their online data erased.
Companies have the option of extending those rights to people living in the U.S. via their privacy policies and terms of services. If they do so, the FTC can then hold the companies accountable for those commitments, said Deven McGraw, Invitae’s head of data stewardship and the former deputy director for health information privacy at the Department of Health and Human Services Office for Civil Rights.
The more technologies we use, the more traceable our activities are, which might be good or bad, so governments need to put checks and balances in place to protect the access and use of such data. I am seeing that this movement has started, as captured in “This is No Ovary-Action: Femtech Apps Need Stronger Regulations to Protect Data and Advance Public Health Goal” [3].

Discussion Thread 3:

Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.’

This article was posted just yesterday, and it explains how Microsoft plans to eliminate face analysis tools in a push for “Responsible A.I.” For years, academics and activists have been raising concerns that facial analysis software claiming to identify a person’s age, gender, and emotional state is unreliable, invasive, and shouldn’t be sold. Microsoft plans to remove those features from its A.I. service for analyzing, detecting, and recognizing faces this week, as part of a push for tighter controls on its A.I. products.
The article goes on to explain how a team at Microsoft has spent the past two years developing a “Responsible AI Standard”. It is a 27-page document that sets requirements for A.I. systems to ensure they will not have a harmful effect on society. Some of those requirements include ensuring that systems provide “valid solutions for the problems they are designed to solve” as well as “a similar quality of service for identified demographic groups.”
One large section of this article covered Microsoft’s heightened concerns about its emotion recognition tool, which labels a person’s expression as anger, disgust, contempt, fear, neutral, happiness, sadness, or surprise. Ms. Crampton, who has worked as a lawyer at Microsoft for 11 years and joined the ethical A.I. group in 2018, stated, “There’s a huge amount of cultural and geographic and individual variation in the way in which we express ourselves.” That led to reliability concerns, along with the bigger question of whether “facial expression is a reliable indicator of your internal emotional state.”
Overall, the company is taking concrete steps to live up to its A.I. principles, and it is going to be a huge journey. The company hopes to use its standard to contribute to the necessary discussion about the standards that technology companies should be held to.

Responses

RE: Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.

While it’s a bit disappointing that this “Responsible AI Standard” wasn’t in place before facial analysis software became available, it’s reassuring that it is being implemented now. Also, I had no idea that face analysis tools could detect age, gender, or emotions. Honestly, I only thought those were used to scan eyes for unlocking your cell phone. I am not sure why these facial analysis tools would need to detect emotions. That seems a bit strange to me.
It’s daunting to think about all the technology at our fingertips and what we can make it do. It’s even more daunting when you think about the fact that there’s not a lot of regulation to it as we don’t know where our limits lie. I do hope this move by Microsoft encourages others with such technology to follow suit, and maybe someday we will implement rules/laws before something potentially harmful is used for a significant amount of time.

RE: Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.

Hi Anna, great post and summary of the “Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.’” article. At first read, I wondered why Microsoft wanted to remove face analysis. However, as I started to think about its applications, it makes sense, and I agree with Ms. Crampton: “There’s a huge amount of cultural and geographic and individual variation in the way in which we express ourselves.” If these types of products are used to determine expressions such as anger, disgust, contempt, fear, etc., then they leave open an opportunity for abuse and a loophole for bad behavior. As Ashley stated, it would have been nice to have had such a standard prior to the software’s release. This article also reminds me of China’s efforts to leverage facial recognition as part of its digital surveillance under the banner of security [1].

RE: Microsoft Plans to Eliminate Face Analysis Tools in Push for ‘Responsible A.I.

Great post and thanks for the summary as well. I too agree with Ashley and Angel’s comments. I think there is something to be admired about the discipline that Microsoft is taking in the AI practices – sometimes ideas come in waves and spaces: technology can be so complex when it comes to implementation so I am not surprised that this is coming up now – to echo Ashley’s comments, it almost seems backward.
That aside, this article did make me think more critically about regulation in America. It’s quite poor. Let’s compare ourselves to the UK and China. The UK’s General Data Protection Regulation (GDPR) says “Everyone responsible for using personal data has to follow strict rules” [1]. The US doesn’t have anything near as robust as the UK does, so unfortunately, we are left vulnerable. A recent article in the WSJ compared the U.S. to China in terms of the rules and design shaping the metaverse. China has a strong taboo against violence and war, and in order not to incite violence, shooting video games there display blood in a green color [2]. This level of attentiveness from China might be extreme for Americans, but I’d say we can definitely do better than where we are now. As the technology world continues to expand with new things, it’s discouraging not to see concrete plans being made in the House and the Senate toward data protection for Americans.

Discussion Thread 4

The Hands-Off Tech Era Is Over

Today, it is obvious that governments cannot ignore technology any longer. In Texas, a controversial bill passed to limit social media corporations’ regulation of online expression, while in Europe, common chargers were mandated for portable devices. More changes to the way tech businesses operate and how people use their products are likely as government regulators become more involved. This suggests that emerging technologies like facial recognition software and autonomous vehicles may take longer than expected to gain traction. According to many tech supporters, the amount of government regulation being tried may easily become overwhelming (Ovide, 2022). Many technologists think that more consideration and oversight will stifle innovation (Ovide, 2022). Tech companies often favored minimal regulation, stating that they were bringing about change for the better, much like television and radio did when those mediums were young.
The conventional hands-off thinking made it seem that those in control of technology should be left to do as they please. As a result, it became more difficult for governments to step in when they became aware that social media was being used undemocratically, that driver-assistance technology had not yet been proven safe, and that Americans had no say in the takeover of our digital information. There is now a feeling of power among regulators. Lawmakers have stepped in to regulate law enforcement’s use of face recognition technology. More laws similar to those passed in Texas are likely to pass in the future to rein in the few Internet CEOs who control the rules that govern free expression for billions of people. Apple and Google will be forced to reinvent the app economy as more nations act. Once more, not all of this will be beneficial government action. However, there are increasing indications that those who develop technology also support increased government regulation, at least in theory.

Responses

RE: The Hands-Off Tech Era Is Over

Kayode,
First of all, this is a wonderful choice of article and a very interesting topic. I always like to look at the various sides of the hands-on vs. hands-off situation. I think it is especially important to look at Apple and how it has been able to completely revolutionize technology in the past. The company has done so by continuously controlling nearly every variable in use on its mobile phones, tablets, and computers. This has enabled those devices to remain secure, efficient, and working up to Apple’s standards. That efficiency does impact users’ rights in a lot of areas, such as the app economy you mentioned: any app published on the App Store has to go through a thorough review by Apple and other vendors. This is where I wonder how much of a hit the industry’s performance will take if those privileges are given back to us.
Obviously, in the overall grand scheme of things, the power being given back to the people is the most important part. Giving us the ability to use our devices privately and have full control of our information is undoubtedly the top priority. But what will happen to Apple’s reputation for efficient and locked-down devices? If regulatory bodies decide to take more power from the organizations and return it to the people, I wonder whether that power will be worth giving up the efficient and convenient use of devices.

RE: The Hands-Off Tech Era Is Over

Hello Kayode, this is an interesting article, and similar to the article Joshua shared. The common denominator in both is the idea of regulation and how it may stifle innovation. The discussion of regulating the tech industry is ongoing but ever-changing. I think the major hurdles boil down to two principles: unpredictability, and the slow regulatory (law-making) process.
1. I think the tech industry is very difficult to regulate because we don’t know what the ramifications of newly implemented technology will be. For example, at the inception of social media like Facebook, it would have been nearly impossible to know that it would become a tool to influence countries’ elections.
2. Technology innovation moves fast. Law making, especially in the United States, is slow. Though regulatory bodies can regulate without directly going through the legislative process, major regulations like the ones in Texas do go through it. In some instances, by the time a regulation or law is put in place, the damage is already done or it is obsolete. It is incredibly difficult to keep pace with technology advancements from a regulatory point of view.

RE: The Hands-Off Tech Era Is Over

The hands-off tech era was bound to end. So much of our lives are lived on the Internet and through our devices, and we need to know that our digital lives are safe. For a period of time, I think there was a knowledge gap between legislators and online users; to put it more simply, a knowledge gap between older and younger generations. Something that I believe also contributed to this problem is that tech grew so fast that it is difficult to monitor. It gets difficult to make a rule when the Internet and technology are innovating so quickly.
With innovations being made so fast, laws do need to catch up. Given the speed at which laws tend to be made, catching up may be harder than it sounds. These laws and protections cannot be made overnight, but a commitment to improving the quality and safety of online platforms can be. With dedication from lawmakers, the Internet can become a better, more trustworthy place for everyone.

RE: The Hands-Off Tech Era Is Over

Great article, as there is a lot to unpack in this subject. I truly believe that much of tech should be hands-off when it comes to innovation and creation, and that the government shouldn’t overregulate it, nor use it for its own nefarious reasons or advantages that may violate our privacy. Not only would that stifle business and competition, it would also put a damper on trust in how the government deals with tech that surveils or gathers info on us. If any example shows how tech can grow without much government involvement, it is SpaceX and its Starship prototype. However, now that the FAA has realized it requires more oversight of the build-out, its processes have slowed Musk’s timeline (the prototype should have flown within a couple of months, not a year).
I, for one, don’t mind some regulations, as long as they are, as the article states, common-sense regulations, where guidelines don’t exceed reasonable limits. I believe that governments do need a sense of pace and speed, and should be able to follow or catch up with these companies if they need to impose a regulation, rather than being stagnant and wasting our tax dollars. Getting internet to rural places is one such example. Countries like Japan and South Korea allow their telecoms to quickly deploy and compete with each other, and still profit while getting internet to everyone (which is why they’ve had 1 Gbps internet speeds for a while now), whereas in the US, we’re barely getting 100 Mbps to everyone with 5G and satellite.
Yes, there must be laws for car autonomy, to make sure manufacturers follow rules and guidelines as they build out their software and infrastructure. But there should also be laws making our privacy one of the utmost priorities, not laws the government can blur into gray areas, impeding and imposing just for its own agenda. Technology should move at its own pace, and the government should be able to keep up, regulating only the common-sense needs in order to protect, not abuse.

RE: The Hands-Off Tech Era Is Over

Hello,

The privacy claims of companies must, at some point, be evaluated vis-à-vis the well-being and lives of the populace affected by them. It is simply the case that these social platforms are widely used, and the argument that these companies have no responsibility in curating them has become strained, if it had legitimacy to begin with. It seems to me that they must either lay out a consistent code of conduct (social media codes of conduct tend to be fairly vague at the minute), or declare these websites to be public spaces, in which case it is a matter of freedom of speech. It appears to me somewhat absurd that these social media spaces are not public spaces, as this is clearly how they are used in practice. There is a lot of difference between one’s private Discord server and the open space of Twitter, for example.
