Well, this will be my final journal post for the year. First, I want to talk about the progress I've been making with my final project. As of now, I am almost done writing the script for the project. Once the script is finished, I will focus my attention on the presentation. For the presentation, I will be using OBS (recording software) to record myself speaking while also presenting my slideshow. Although I have a few finals heading my way next week, that shouldn't stop me from completing the project on time. I came up with the idea of using OBS because of YouTube's recommendation AI; it suggested videos on how to set up OBS. For once, YouTube gave me an idea that I could use (and am using) based on my data (in this case, my watch history). Companies that obtain our information and use AI could also return the favor and provide useful information back to us, but there's a lot more to it than that, which is why I created this project. Moving on from the project, I mentioned that this would be the last journal I post for the school year. I'm thankful to both Mr. Bott and Mrs. Gergen for assigning us these journals. I'm not sure if this affected anyone else, but posting journals kept me on track with the things I was doing, and the feedback I received helped me move forward with my creativity and thinking. I would incorporate weekly journals in upcoming years because they would help students who are passionate about their topics move forward and expand their thinking.
My presentation is going to be an argument-based presentation, with the "So what?" being the main priority throughout. I chose an argument-based presentation because I want to show the audience why we should reconsider our thoughts on privacy. During this pandemic, there have been numerous articles published on privacy alone. I want to use those articles to support my claim while adding a counterclaim to my argument. There have been some cases where giving up some of our privacy has served a good cause. I've already started writing the script for the presentation, and I'm looking forward to finishing it by the beginning of next week. I'm also following the news more closely and watching for new articles that focus on both COVID-19 and privacy.
For this week, I started planning how I'm going to approach my topic. For the final product, my goal is to make sure that the audience understands how we can approach privacy differently. I'm going to create a slideshow with a voice-over on each slide, almost like giving a presentation in real life, except that nobody can see me. Next week, I'm going to go into research mode and find out what new information is being presented. While I was reading the news this week, I stumbled upon an article about the stimulus check. Apparently, people are receiving fake checks in the mail. The Secret Service has mentioned a few ways to tell whether you have a legitimate check: check the treasury seal, see if there's any bleeding ink, look for the watermark, and make sure the right side says "Economic Impact Payment President Donald J. Trump." Sources also say that you may notice false advertising roaming around the internet mentioning the stimulus check. What does this say about our privacy? How will false advertising affect those who aren't familiar with it?
I've spent this week focusing on my essential question. We are living in a difficult time, when our privacy is something that can't be prioritized. Healthcare workers are encouraged, if not required, to send reports of patients to check whether they have COVID-19; teachers are required by the state to switch to remote learning; and workers have to either sit at home unemployed or transition to working from home. How can we manage to protect our privacy in a time when privacy is being given up for security? This is the essential question that I'll be working with leading up to the final project. COVID-19 alone has drawn a lot of tension over privacy, and it's important that we figure out a way to keep our privacy intact.
This week has been a little bizarre. I wasn't able to submit my April SDA this month due to unexpected events, but I've taken some time to reflect on my essential question. As you may have noticed over the past month, my topic of surveillance capitalism has shifted a bit. COVID-19 has caused a lot of controversy over privacy concerns. Zoom has been one of the major companies in the news recently. From handling business conferences around the world to hosting classes online, Zoom has been a hot topic this year. Because of the sudden change in my topic, I had to reshape my essential question. In times like these, privacy concerns have shot through the roof. Medical records are being collected non-stop from hospitals; businesses such as Google and Zoom are providing free online services, for the time being; people are losing their jobs because of the pandemic; and so on. What does this information tell us about the future? Along with these free services comes risk. As you may have heard from my previous journal(s), Zoom has been sending information to Facebook, regardless of whether you have a Facebook account. This is a difficult time for all of us, but businesses are doing everything they can to profit from us. Going back to my one-pager, I'm planning on doing everything I can to submit it before Monday so I don't fall behind.
Last week, I talked a little bit about the TED talk assignment. I sent Mr. Myrstad a message on Twitter letting him know that I want to interview him. At the moment, he hasn't responded yet, but hopefully he responds soon. If the interview doesn't go as planned, I'm going to make a video of some sort that focuses more on the current pandemic. Because of the pandemic, there has been a lot of news recently about new apps coming to app stores that revolve around COVID-19, and surprisingly enough, Google and Apple decided to step in. Recently, Google and Apple partnered to create technology that uses Bluetooth to inform you about the virus and whether or not you have come in contact with someone who has it. However, knowing Apple and Google, there have been a lot of concerns raised regarding privacy. Critics are saying that this system could be abused heavily, with many types of devices being recorded and logged into a database, a variety of personal information being sent to third parties, etc. According to MarketWatch, Apple said that the information collected would only be used for contact tracing. Google has not made a statement so far, which also leaves concern for individuals. With a new year started, and problems arising due to the pandemic, privacy is at stake. My goal for this month's SDA is to figure out where surveillance capitalism is right now. Are these huge corporations profiting more than usual because of the pandemic? What new information are they collecting? What does our privacy look like in the future?
Works Cited: Buchwald, Elisabeth. "'This could be abused.' Privacy experts take cautious approach to Apple and Google's coronavirus contact-tracing technology." MarketWatch, 16 Apr. 2020, https://www.marketwatch.com/story/this-could-be-abused-privacy-experts-take-cautious-approach-to-apple-and-googles-coronavirus-contact-tracing-technology-2020-04-16. Accessed 16 Apr. 2020.

After completing the TED talk assignment, I brainstormed a lot of ideas that I could use for the symposium. I'm very glad that I watched Finn Myrstad's TED talk because he made a very true statement: whenever we download an app, we rarely read the terms of service. We are unaware of what the app does in the background and how it's impacting us privacy-wise. According to Business Insider, a study of 2,000 people found that 91% of people consent to terms of service without reading them. For young adults, the percentage was even higher: a whopping 97% of those between the ages of 18 and 34 gave consent without reading. Let's talk about Google Assistant. Google Assistant is an artificial-intelligence-powered virtual assistant that is mainly used on smartphones and smart home devices. When people think of Google Assistant, they often think of Alexa (Amazon's virtual assistant) or Siri (Apple's virtual assistant). The sole purpose of Google's assistant was to ease people's lives by completing simple tasks the user requests. If someone were to say, "OK Google, what's the weather outside?", the assistant would provide the current temperature along with the weather conditions. It seems simple, but it's not. Completing simple tasks isn't the only thing Google's assistant is capable of; it is a data-mining machine as well. Let's use the weather example above. Simply saying "OK Google" or "Hey Google" triggers the device and starts it recording your conversation. Just by asking about the weather, you've given Google a copy of your voice.
Not only that, but you just helped the AI. Google's assistant is a learning AI, meaning that the data it collects helps improve the AI's responses to your tasks and questions. If you were to talk to the assistant using slang, the assistant would still respond normally because of the data it has collected. This goes for all virtual assistants: in order for the product to work successfully, it must constantly learn and accomplish the tasks the user requires. If you look at Google's website, the assistant can also handle more complex tasks, such as playing music from Spotify, playing iHeartRadio, or reading aloud the latest news briefing. To those who are using these virtual assistants: did you know this was happening? The virtual assistants you think are helping you are also learning from you and generating profit. As of right now, I'm trying to get a hold of Mr. Myrstad himself to see if he's willing to do an interview with me. If all goes well, I'll be able to do the interview. If that doesn't work, I'm thinking of making a video for my project about Ed Law 2-d and how our privacy has been affected because of COVID-19.
Sources: English, Trevor. "What Data are Voice Assistants Collecting and How to Protect Yourself." Interesting Engineering, 30 Jan. 2020, https://interestingengineering.com/what-data-are-voice-assistants-collecting-and-how-to-protect-yourself. Accessed 9 Apr. 2020. "Google Assistant, your own personal Google." Google, 3 Mar. 2020, https://assistant.google.com/. Accessed 9 Apr. 2020.

The TED talk I chose to watch was "How tech companies deceive you into giving up your data and privacy" by Finn Lützow-Holm Myrstad, along with "The Danger of a Single Story" by Chimamanda Ngozi Adichie. While these two TED talks are about different topics, they have one thing in common. As a society, we have neglected careful reading, especially with technology being a part of our everyday lives. In Myrstad's presentation, he spoke about the security and privacy that we give up to products on the market. One striking example he presents is Cayla, a toy that can connect to the internet and use speech recognition to answer your child's questions, similar to Amazon's Alexa or Google's assistant. Instead of focusing on the object itself, we find it hard to take the time and consider what could happen in the future. What if Cayla were used to lead predators to your house? We have no idea what these devices are doing, and we should hold ourselves accountable for using them. One thing that caught my attention while I was listening to Adichie was that humans are full of bias, and we consume lots of it. Adichie's college roommate was surprised by how much she knew about Western culture. Humans consume technology daily because it has become a necessity. We assume that the technology we use reduces the stress in our lives, but at what cost? Myrstad states that we often find excuses not to read the terms of service because of how eager we are to use the service, or because the terms are simply too long and boring to understand and merely waste our time.
Myrstad and his team chose multiple apps and read over their terms of service. It took them over 31 hours to thoroughly read the terms of service for each app. Imagine using YouTube, Facebook, or Amazon Alexa during those 31 hours without fully realizing what data we've given up. Adichie's speech on bias, along with Myrstad's speech on data and privacy, shows us that we have put technology at the front of the line instead of ourselves. We make these assumptions because of how superior and intelligent technology seems when, in reality, who is it really helping? Something that I want to emphasize at the symposium is the "So what?". For my topic, the audience must understand what I'm trying to say. The work that I've put into this research class has meant a lot to me, and the information I've gathered over the past seven months should stick in the minds of others. Some techniques I saw both Adichie and Myrstad use were jokes and storytelling. My job during the symposium is to keep the audience engaged, and what better way to keep them engaged than by telling stories and cracking some jokes? With these two techniques, along with incorporating the "So what?", the audience will surely enjoy listening to my presentation and hopefully take something away from it.
March has passed, and now it's April. With a lot going on during this unprecedented time, my job is to post a weekly journal for the month of April. Going back to my last post, I mentioned that I was going to do an interview. I decided that I would interview an economist who focuses primarily on data and business; if that doesn't work, I'll turn to an author who's more familiar with surveillance capitalism. With the current pandemic, I know and understand that it'll be difficult to find someone to interview since they'll be working from home. Today, I was looking at Reddit (r/privacy) and I saw a post about Zoom. If you don't know what Zoom is, it's a conferencing service that businesses use to talk and hold discussions. Zoom has become a lot more familiar mainly because of the current pandemic. Professors and teachers are starting to use Zoom to stay in contact with their students and cover new material so they won't fall behind. According to Vice, if you are using Zoom on your iOS device, Zoom will send data to Facebook, even if you don't have a Facebook account. The crazy thing is, they don't mention that anywhere in their privacy policy. If Zoom is that dangerous, especially for minors, who's to say that Google Meet doesn't do the same thing? I know that Ed Law 2-d isn't fully in effect yet, but could Zoom or Google be in trouble for these practices? There are a bunch of articles posted on r/privacy, and I suggest that you check it out.
Source: Cox, Joseph. "Zoom iOS App Sends Data to Facebook Even If You Don't Have a Facebook Account." Vice, Vice Media, 26 Mar. 2020, www.vice.com/en_us/article/k7e599/zoom-ios-app-sends-data-to-facebook-even-if-you-dont-have-a-facebook-account.

For the March SDA, I focused on surveillance capitalism and healthcare. I wasn't satisfied with my overall product. While I was able to talk about Google's and Facebook's takes on healthcare, I didn't cover the government's perspective, such as the HIPAA Privacy Rule. There was a lot of information that could have been included in my SDA but wasn't. In regards to time, I had plenty on my hands: I had all the research I needed for this assignment. However, I didn't manage my time well on the SDA itself. Even with all the research done, I was cutting it close when creating the Prezi, and because of this, the outcome wasn't the best. These last few months are going to be exciting. We are a few months away from the symposium, which means I'll have to finalize the work I've done throughout the year and create something the audience has never seen before. Before that SDA, we will be assigned an interview. Because of the current pandemic, finding a person to interview will be difficult, but I'm willing to do anything I can to make the interview happen. With this interview, I'm going to redeem myself by including the information I failed to include in this month's SDA and incorporating it into the next one.