Week 8 Learning Journal Post

Part 1: Review Other Teams' Final Video Projects

Team A: EduOtter CST349 Final Research Video Project

Link: https://www.youtube.com/watch?v=0DJxwiSEHjY

1) Coverage of Topic

The video does a very good job of explaining what a DeepFake is, how one is created, and why they are dangerous, and it offers practical media-literacy advice for identifying them: ask where the content came from, check for mouth and eye glitches, and verify the story with a source you trust.

It also briefly acknowledges that methods exist to detect and possibly prevent DeepFakes, such as watermarking standards.

2) Clarity of Presentation

Clear and easy to understand. The opening reference to Tom Holland/Taylor Swift immediately provides an example of a DeepFake, and the "quiz" segment provides a way for viewers to practice their new media-literacy skills.

Adding on-screen bullets during the media-literacy checklist would provide additional support and emphasis for each step.

3) Quality of Research

All claims made about the creation, risk of, and detection of DeepFakes are consistent with the current mainstream thinking regarding the risks of Generative AI.

To further enhance the credibility of the research, the team could display references to two or three relevant articles or sources published within the last few years (on elections, detection difficulties, etc.) on screen while discussing these topics.

4) Production of Video

The pacing is good, the narrator is easy to hear and understand, and the examples used (dogs at the beach, face swaps) are a great representation of the potential uses of a DeepFake.

Including subtle lower thirds with key terms such as "Deepfake" and "Media Literacy," and consistently displaying captions, would increase the accessibility and professionalism of the video.

5) Engagement/Interest Level

High level of engagement. The use of a quiz format and the example of a school being canceled due to misinformation will likely stick with the viewer.

Consider adding a 10-second summary slide ("Stop. Look. Verify. Share responsibly.") at the end of the video for viewers to take away and remember.

6) Evidence of Teamwork

The structure of the project suggests a collaborative effort in writing and reviewing the script; a brief credits segment naming the team members and their contributions would make the group effort more visible.

7) Fit for Targeted Audience

Excellent for a general or younger audience. The language is plain, and the suggestions for identifying DeepFakes are concrete and don't require specialized knowledge. For a professional/technical audience, consider adding a sidebar on detecting DeepFakes (audio artifacts, provenance, watermarking) and on the current state of DeepFake policy.

Team B, Group 5: Talent Engine — "Can AI replace tutors and teachers?"

Link: https://www.youtube.com/watch?v=AMS4kjELDWg

(1) Coverage of topic

Very good: the team covered all of the necessary areas in their video -- history (Plato to LLMs), adoption stats, capabilities (personalization, scale, instant feedback), limits (pedagogy, hallucinations, ethics/bias/privacy, over-reliance), real-world tools (Jill Watson, Duolingo Max, Khanmigo), a human-vs.-AI comparison, and a collaborative future. Complete.

(2) Clarity of presentation

The presentation has a clear, logical flow with frequent signposting. Brief section title cards and a summary slide with three bullets per major section would help viewers retain the information even better.

(3) Quality of research

The video rests on grounded claims, such as Pew's 2024 data on AI usage in education and known deployments of AI programs. To increase rigor, on-screen citations with source and year should be added when quoting a statistic or named program, and limitations (sample, context) should be noted.

(4) Production of video

Smooth pacing and narration, and the examples are concrete. Consider including consistent lower thirds for key terms (LLM, hallucination, pedagogy), burned-in captions for accessibility, and equalized audio levels across segments.

(5) Engagement & Interest

Good hook (the "AI teacher" scenario), relatable examples, and balanced framing. Adding a short demo clip or a side-by-side "AI vs. teacher" comparison would increase engagement even more.

(6) Evidence of Teamwork

Structured hand-offs and comprehensive coverage suggest coordinated roles (history, data, tools, limits, outlook). A closing credits card listing roles would make the collaboration explicit.

(7) Fit for target audience

Well pitched for tech-savvy learners and general audiences, with minimal jargon and clear explanations. For a professional CS audience, a technical appendix covering model constraints, evaluation frameworks, and data-privacy architectures could be included.

Team C: Otterbotts: “Drone Delivery”

Link: https://www.youtube.com/watch?v=ppdupisIGI

1) Subject Coverage

The subject is covered end to end: history; the tech stack (airframes, sensors, AI, geofencing, and the FAA); case studies (Prime Air, Zipline, Wing, carriers); challenges (regulations, safety, privacy, jobs); and the future (smart cities, swarms, energy). The subject is very well covered and the scope is complete.

2) Presentation Clarity

The narration is clearly structured with smooth transitions; a short title card for each segment would help. Additionally, a one-slide recap after "Challenges" would reinforce the major points from the previous segments.

3) Research Quality

Examples and claims are accurate, with a good distinction between quadcopter and fixed-wing drone applications. To raise the research quality, cite the program, year, and location on screen for each company referenced, and provide a couple of metrics (range, payload, flight time) for each.

4) Production (Video)

Clean pacing and voiceover; the explanation is concrete. Adding captions and consistent lower thirds for terms (BVLOS, geofencing, LiDAR, UTM) would enhance the presentation, as would a few labeled diagrams (e.g., the sensor stack) and a "ground vs. air last-mile" comparison graphic to illustrate the advantage.

5) Engagement & Interest

Adding a 15-20 second route simulation (warehouse to drop point to return) would increase visual engagement.

6) Evidence of Teamwork

The segmented approach suggests multiple team members had assigned roles. An end card listing contributors and their responsibilities would make it more apparent how the team worked together.

7) Fit to Audience

A good fit for a tech-savvy class; technical language is used sparingly and definitions were provided. A brief appendix defining the autonomy stack, fail-safes, and Remote ID would be appreciated by CS audiences.


Part 2: Keep Up With Your Learning Journal

https://youtu.be/jWQoO-hBeQc

https://youtu.be/OdkkDc02Ktk?si=dihJHXxCwBZz8Zbx

This class has been a great opportunity to tie together a lot of threads for me: technical communication, ethical reasoning, the discipline of research, and teamwork. On the content side, I've learned to frame emerging technologies through readily understood ethical lenses (stakeholders, risks, benefits, trade-offs) rather than just features, and to see how careful prompting and structural revision can make AI a partner in writing rather than a shortcut around it. The presentation modules were also helpful: they got me to storyboard before scripting, to design for the audience (general vs. technical), and to use signpost language so viewers always know where they stand in the narrative. I've especially enjoyed the career components, like the resume/cover-letter critiques and mock interviews. These have made me much more intentional about impact statements in resumes, concrete examples in interviews, and explicit constraints and assumptions in technical answers.
