  • How algorithms influence your life and decisions.
  • How software algorithms deployed by organizations can be audited to find errors, misuse or bias
  • Frames in which editorial transparency can be achieved


Platform power, algorithm accountability and reporting, for Non-Technicals, Leaders, Managers, Freshers and Beginners.

In the previous course, you learnt what algorithms are and how they work. In this course we will discuss the algorithmic power behind many social media applications, and much more:

  • How algorithms influence your life and decisions.
  • How to call these algorithms into question.
  • How to earn a living by auditing or questioning how and why these social media algorithms behave the way they do, or make decisions for society that could be dangerous down the line.

This course is a single installment of a broader advanced 3-part course called ‘Automated Content Production and News Algorithms’. It consists of 4 modules and 20 learning videos, including transcripts and practice questions.

This part of the course covers:

  • The algorithmic power behind TikTok
  • The algorithmic power behind Twitter
  • The algorithmic power behind Triller
  • The algorithmic power behind YouTube
  • The algorithmic power behind Facebook
  • Algorithmic accountability reporting methods
  • Techniques to critique, investigate and report an algorithm
  • Understanding how to validate and document the occurrence of errors in technology or software applications
  • Editorial responsibility and algorithmic transparency
  • Frames in which editorial transparency can be achieved
  • How software algorithms deployed by organizations can be audited to find errors, misuse or bias
  • Case Study 1: How minority neighborhoods pay higher car insurance premiums than privileged areas with the same risk factor
  • Case Study 2: Software meant to predict future criminals is biased against blacks
  • Analysis of the algorithm used by prisons to create risk scores
  • Angles for investigating algorithms

You will get 20 videos of learning content with transcripts and practice questions.

Besides the video materials, there are also transcripts of the videos, and practice questions to help guide and reinforce your learning.

Buy this course now and get started.


  • Willingness to learn and practice
  • Non-Technicals
  • Leaders
  • Managers
  • Freshers
  • Section 1 : Platform power 7 Lectures 00:08:59

    • Lecture 1 :
    • Hello and welcome to module 7 of the Automated Content Production course, called Platform Power and Algorithms. This module will help you understand the power of the algorithms behind platforms like TikTok, Triller, Twitter, Facebook and YouTube, and how audiovisual media and short-form expression are driving engagement, helping newsrooms gain a whole new set of audiences. Also, as a reminder, after watching the videos, please proceed to read the three documents that are added to this module. After this, please ensure you take the quiz.
    • Lecture 2 :
    • Triller
    • Triller is an “AI-powered” video-editing app. Its auto-editing algorithm and facial analysis make it edit like a talented human editor. Users can film multiple clips of themselves and, using artificial intelligence, the app will automatically compile the best clips to create a music video. Triller organizes its Discover pages with leaderboards, genre categorization, promoted campaigns, and top videos based on the user’s viewing history. Triller has reported 500% month-over-month growth and currently has over 26.5 million monthly active users, competing with TikTok in the U.S., and more than 75 million total users. Its unique algorithm is the key driver of its growth.
    • Lecture 3 :
    • Facebook
    • In 2018, Mark Zuckerberg announced that the news feed algorithm would prioritize posts that trigger discussions and meaningful interactions. This change was meant to bring quality, organic content to the forefront. The algorithm prioritized posts that collected high-quality interactions (comments, reactions, responses to comments, sharing in Messenger). By prioritizing posts from family and friends over public content from Pages, Facebook signaled its belief that a person-to-person connection is more valuable than a person-to-page connection, and that content from friends and family tends to spark more “active” engagement from users. Facebook also announced a tool aimed at giving users more transparency and control over what is shown in the news feed. The button “Why do I see this post?” does exactly what it promises: it helps the user understand why the algorithm shows this particular post in the news feed. It also allows users to “tell” the algorithm directly what is important to them and what is annoying. Users can tell Facebook that they want to see fewer posts from a specific person or see more posts from a specific business page.
    • Lecture 4 :
    • Twitter
    • Lecture 5 :
    • YouTube
    • YouTube is different from other social media platforms because viewers watch longer videos. One of the most important metrics the algorithm looks at is watch time. Longer videos translate to higher watch time, which tells the YouTube algorithm that a viewer might stay on the platform longer. Couple this with high engagement – content which users watch to the end, like, comment on or share – and it’s a perfect recipe for greater visibility on YouTube. The algorithms behind all of these platforms serve many different purposes. While many are for entertainment, the power of Twitter, for example, has been leveraged not only for news but for driving advocacy, influencing elections and challenging policies. In Nigeria, the government has been overwhelmed by this power and by how much its citizens use Twitter to call out ills and criticize the government, to the point that it has been looking for ways to introduce a ‘Social Media Bill’ as a way to stifle it. In the next video we will be discussing algorithmic accountability reporting methods. That will be all for now.
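The watch-time idea in this lecture can be sketched as a toy scoring function. This is purely illustrative: YouTube’s actual ranking system is proprietary, and every weight, feature name and number below is an invented assumption.

```python
# Toy sketch of a watch-time-driven ranking score.
# All weights and feature names are illustrative assumptions,
# not YouTube's real (proprietary) algorithm.

def rank_score(watch_time_min, completions, likes, comments, shares):
    """Combine watch time with engagement signals into one score."""
    engagement = 2.0 * completions + 1.0 * likes + 1.5 * comments + 2.5 * shares
    # Watch time dominates, as the lecture suggests.
    return 5.0 * watch_time_min + 0.1 * engagement

videos = {
    "long_tutorial": rank_score(45.0, 120, 300, 40, 25),
    "short_clip": rank_score(3.5, 400, 500, 10, 5),
}
best = max(videos, key=videos.get)
print(best)  # the long, high-watch-time video wins despite fewer likes
```

Even with far more likes, the short clip loses here because watch time carries most of the weight – that is the intuition the lecture describes, not a claim about YouTube’s real weighting.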
    • Lecture 6 :
    • TikTok
    • TikTok is a 15-second video sharing platform created by the Chinese company ByteDance. With 1.4 billion downloads, the app as of today has a lot of influence on the U.S. music industry, as music outfits are using it to explore user preferences and consumption habits. But TikTok is not just for music: news organizations like the Washington Post, NBC News and ESPN have taken great strides in leveraging it for journalism in unique ways, introducing a new set of audiences, especially younger people, to their content and bridging the gap between the world and the profession. How does its algorithm work? When you launch the app, you immediately see a ‘For You’ page, generated through algorithms which take into account the videos you have previously seen, liked or shared. It is not like Facebook or Twitter, which show content based on people you follow. The algorithm acts on the views, likes, comments, shares and downloads a video receives. It shows your video to more people based on the velocity of the engagement it receives: if a video suddenly receives 30% more likes one day, it is sent to the ‘For You’ page of many users, sparking a sudden rise in engagement. So far, the aforementioned platforms have used TikTok to introduce more people to the beauty of journalism through humor, dance competitions and the daily life of a journalist, gathering millions of engagements in return. The Washington Post has garnered over 542,700 followers and 23.4 million likes, NBC News 146,500 followers with 2.3 million likes, and ESPN 9.3 million followers and 403.7 million likes.
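The engagement-velocity behavior described above can be sketched in a few lines. The 30% figure comes from the text; the function name, threshold handling and sample data are hypothetical illustrations, not TikTok’s actual logic.

```python
# Toy "engagement velocity" check: promote a video whose daily likes
# jump sharply compared to the previous day. Hypothetical logic only.

def should_boost(likes_by_day, threshold=0.30):
    """Return True if the latest day's likes grew past the threshold."""
    if len(likes_by_day) < 2 or likes_by_day[-2] == 0:
        return False
    growth = (likes_by_day[-1] - likes_by_day[-2]) / likes_by_day[-2]
    return growth >= threshold

print(should_boost([100, 105, 140]))  # ~33% jump -> True
print(should_boost([100, 105, 110]))  # ~5% jump -> False
```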
    • Lecture 7 :
    • Quiz
    • Quiz
  • Section 2 : Algorithm Accountability 5 Lectures 00:00:35

    • Lecture 1 :
    • Algorithmic Accountability Reporting Methods
    • Hello and welcome to module 8 of the Automated Content Production and Algorithms course, where we will discuss algorithmic accountability reporting methods. In this module we will discuss: I. Algorithmic accountability reporting methods – overview; II. Techniques to critique and investigate an algorithm. Also, as a reminder, after watching the videos, please proceed to read the three documents that are added to this syllabus. After this, please ensure you take the quiz.
    • Lecture 2 :
    • Algorithmic Accountability Reporting Methods- Overview
    • Algorithmic Accountability Reporting Methods – Overview. The focus in the previous videos has been how algorithms can help news organizations optimize and get the best out of their content; how algorithms are helping us make decisions without our knowing it; and how companies are using algorithms to decide who gets what benefit and who does not. When you need car or health insurance, an algorithm can decide whether you will get it. An algorithm can suggest that you pay four times what you ought to pay for car insurance simply because you live in a run-down neighborhood, even if your neighborhood is actually safe. An algorithm can assume you want to commit health insurance fraud because you are applying for a claim only two or three months after taking out the policy, even if your life was on the line. Algorithms affect our lives and society in many ways. In this discussion, we will cover techniques journalists use to take algorithms apart: investigate how they were built, critique them, question their design and development, and question their performance – the input – to see if the output is the desired one.
    • Lecture 3 :
    • Techniques to Critique and Investigate an Algorithm
    • Lecture 4 :
    • Quiz
    • Quiz
    • Lecture 5 :
    • Techniques to Critique and Investigate an Algorithm
  • Section 3 : Editorial Responsibility and Algorithmic Transparency 5 Lectures 00:02:06

    • Lecture 1 :
    • Editorial Responsibility and Algorithmic Transparency
    • Hello and welcome to module 9 of the Automated Content Production and Algorithms course. In this module we will talk about editorial responsibility & algorithmic transparency. We will discuss: I. An overview of editorial responsibility & algorithmic transparency. Also, as a reminder, after watching the videos, please proceed to read the three documents that are added to this module. After this, please ensure you take the quiz.
    • Lecture 2 :
    • Editorial Responsibility and Algorithmic Transparency- An Overview
    • Editorial responsibility & algorithmic transparency – An Overview. This refers to the code of conduct which describes the responsibilities of publishers, editors and journalists towards the public. The code includes basic fundamentals such as the care that must be taken to “avoid publishing inaccurate, misleading or distorted information, including pictures” (Ansgar Koene et al.). Algorithmic transparency is openness about the purpose, structure and underlying actions of the algorithms used to search for, process and deliver information. News organizations and social media platforms alike harness user data daily – behavioral patterns, sentiments, conversations and more – to understand user preferences and optimize their content to suit users. Once upon a time, there was absolutely no transparency in how the algorithms behind these platforms decided what to show or not to show to users.
    • Lecture 3 :
    • How Publishers are Taking Responsibility
    • In the past couple of years, social media platforms like Facebook and Twitter began working towards transparency by giving users the power to tell the algorithms what they do or do not want to see. Facebook deployed a “Why am I seeing this?” feature. Twitter deployed a similar feature with “I’m not interested in this”, a function that gave the user more ability to tell the algorithm the type of content he or she is not interested in. Once upon a time, social media platform owners argued that they did not publish content but rather reshaped what people see, and were thus not responsible for the downstream circumstances that decisions made by their algorithms might create. A common method used to provide transparency and ensure algorithmic accountability is the use of third-party audits. This approach is known as qualified transparency. After complaints were made to the Federal Trade Commission (FTC) about the search giant Google, for example, watchdog algorithms created by FTC staffers found that Google’s search algorithms generally caused its own services to appear ahead of others in search results. To provide transparency, the criteria used in the evaluation, as well as the results, were publicly released and explained. Although the FTC decided Google’s actions were not anti-competitive in nature, the negative publicity the investigation created inspired Google to make changes.
    • Lecture 4 :
    • Frames in which transparency can be achieved
    • Frames in which transparency can be achieved. As stakeholders – pressure groups, audiences, public opinion and regulators – continue to push for algorithmic transparency, the paper “Challenges of a platform press: Algorithmic accountability and transparency in news media” suggested that the frames in which transparency can be achieved include: 1. Public responsibility; 2. Professional responsibility; 3. The political frame. Public responsibility: This develops out of a public need for something, which is why the first step at this point is to raise public awareness about the workings of algorithms. As mentioned above, it is contracted between press and society when both parties agree on the importance of certain issues. More research dedicated to the workings and effects of algorithms would add to a public debate about algorithms, a crucial step for creating an awareness of algorithmic accountability in media. Even if users were enlightened about the workings of algorithms, there are no references yet for how users would react. Professional responsibility: Transparent journalism that shows users how content was produced and why it is published through a certain channel, as well as the missing incentives for good journalism on platforms, are the main themes in this part. Accountability originating in this frame is self-imposed by professionals working in the field of journalism. The answer is a transparent process of journalism, with integrity: show your audience how you operate, why you make the decisions you make, what your reports are based on, and be as transparent as possible about it. The political frame: Capitalist interests stand in the way of transparency. Platforms do not want to share their trade secrets because their actions are steered by shareholder interests. Company owners have to explain to shareholders why they are doing something, and they are constantly afraid of being sued if anything they reveal decreases the company’s value.
However, state-induced measures can help pressure the current power structures towards transparency, for example through the introduction of policies and further regulation. In the notion of responsibility that we align to, what is of utmost importance is that a responsible editorial approach should be taken as a shared, collective, multi-stakeholder responsibility. In the next module, which is the last, we will discuss the Algorithms Beat – Angles and Methods for Investigation.
    • Lecture 5 :
    • Quiz
    • Quiz
  • Section 4 : Algorithm Beats 7 Lectures 00:08:14

    • Lecture 1 :
    • Introduction
    • Hello and welcome to module 10 of the Automated Content Production and Algorithms course, called The Algorithms Beat. In this module we will discuss: I. An overview of the algorithms beat; II. Algorithm beat case 1: How minority neighborhoods pay higher car insurance premiums than white areas with the same risk; III. Algorithm beat case 2: Machine bias – software used across the country to predict future criminals is biased against blacks; IV. Driving forces for algorithmic accountability stories. Also, as a reminder, after watching the videos, please proceed to read the three documents that are added to this syllabus. After this, please ensure you take the quiz.
    • Lecture 2 :
    • An Overview of Algorithm Beats
    • An overview of the algorithms beat. In module 9 we discussed editorial responsibility & algorithmic transparency: the code of conduct which describes the responsibilities of publishers, editors and journalists towards the public. Here we explore how stories can be found in algorithms, and the different angles and methods for investigating and finding what is newsworthy about algorithms. Algorithms are more present in our lives than we think: they exist on e-commerce platforms when we want to buy an item, in video-on-demand platforms like Netflix, and at train stations and airports. Banks and loan companies use algorithms to determine who is creditworthy, insurance companies use them to prevent or investigate fraud, prisons use them to classify and rate inmates, and search engines use them to define what information we see. The algorithms beat looks at algorithms deployed by different systems or organizations, and audits and critiques them to find errors, bad decisions, bias or misuse, with the end point of reporting on them. It explores how these discoveries are affecting, or could potentially harm, individuals or society at large.
    • Lecture 3 :
    • Algorithm Beats Case 1
    • Algorithm beat case 1: How minority neighborhoods pay higher car insurance premiums than white areas with the same risk. Here we look at a story originally published by ProPublica on their investigation and analysis of premiums and payouts in California, Illinois, Texas and Missouri, which shows that some major insurers charge minority neighborhoods as much as 30 percent more than other areas with similar accident costs. Otis Nash works six days a week at two jobs, as a security guard and a pest control technician, but still struggles to make the $190.69 monthly Geico car insurance payment for his 2012 Honda Civic LX. Across town, Ryan Hedges has a similar insurance policy with Geico. Hedges, a 34-year-old advertising executive, pays only $54.67 a month to insure his 2015 Audi Q5 Quattro sport utility vehicle. Nash pays almost four times as much as Hedges even though his run-down neighborhood, East Garfield Park, with its vacant lots and high crime rate, is actually safer from an auto insurance perspective than Hedges’ fancier Lake View neighborhood near Wrigley Field. On average, from 2012 through 2014, Illinois insurers paid out 20 percent less for bodily injury and property damage claims in Nash’s predominantly minority zip code than in Hedges’ largely white one, according to data collected by the state’s insurance commission. But Nash pays 51 percent more for that portion of his coverage than Hedges does. For decades, auto insurers have been observed to charge higher average premiums to drivers living in predominantly minority urban neighborhoods than to drivers with similar safety records living in majority-white neighborhoods. Insurers have long defended their pricing by saying that the risk of accidents is greater in those neighborhoods, even for motorists who have never had one.
But a first-of-its-kind analysis by ProPublica and Consumer Reports, which examined auto insurance premiums and payouts in California, Illinois, Texas and Missouri, has found that many of the disparities in auto insurance prices between minority and white neighborhoods are wider than differences in risk can explain. In some cases, insurers such as Allstate, Geico and Liberty Mutual were charging premiums that were on average 30 percent higher in zip codes where most residents are minorities than in whiter neighborhoods with similar accident costs. ProPublica’s findings show that, despite laws in almost every state banning discriminatory rate-setting, some minority neighborhoods pay higher auto insurance premiums than do white areas with similar payouts on claims. This disparity may amount to a subtler form of redlining, a term that traditionally refers to the denial of services or products to minority areas. And, since minorities tend to lag behind whites in income, they may be hard-pressed to afford the higher payments. It is not completely clear why some major auto insurers persist in treating minority neighborhoods differently. It may in part be a vestige of longstanding practices dating back to an era when American businesses routinely discriminated against non-white customers. It is also possible that the proprietary algorithms used by insurers may inadvertently favor white over minority neighborhoods. This investigation marks the first use of industry payout data to measure racial disparities in car insurance premiums across states. Over 100,000 premiums were analyzed, and the analysis was limited to the four states that release the type of data needed to compare insurance payouts by geography, but it raises the prospect that many minority neighborhoods across the country may be paying too much for auto insurance, or white neighborhoods too little.
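The core of ProPublica’s comparison – what a neighborhood pays versus what insurers pay out there – can be illustrated with a toy premium-to-payout ratio. The zip labels and figures below are invented for illustration; they are not ProPublica’s data.

```python
# Invented example data: (average annual premium, average payout per car).
zips = {
    "minority_zip": (1900.0, 400.0),
    "white_zip": (1400.0, 500.0),
}

# Premium divided by payout approximates how much a zip code is
# charged relative to its actual accident risk.
ratios = {name: premium / payout for name, (premium, payout) in zips.items()}
for name, ratio in ratios.items():
    print(f"{name}: premium is {ratio:.2f}x the average payout")
```

A much higher ratio despite similar or lower payouts is the kind of disparity the investigation flagged: price differences wider than risk can explain.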
    • Lecture 4 :
    • Algorithm Beats Case 2
    • Algorithm beat case 2: Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. This story was originally published by ProPublica. On a spring afternoon in 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid’s blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs. Just as the 18-year-old girls were realizing they were too big for the tiny conveyances — which belonged to a 6-year-old boy — a woman came running after them saying, “That’s my kid’s stuff.” Borden and her friend immediately dropped the bike and scooter and walked away. But it was too late — a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80. Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store. Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile. Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden — who is black — was rated a high risk. Prater — who is white — was rated a low risk. Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars’ worth of electronics. 
Another original publication by ProPublica detailed insights into how they analyzed COMPAS (Correctional Offender Management Profiling for Alternative Sanctions), a software product of a for-profit company, Northpointe, whose algorithm is used to create risk scores. ProPublica’s analysis found that black defendants were far more likely than white defendants to be incorrectly judged to be at a higher risk of recidivism, while white defendants were more likely than black defendants to be incorrectly flagged as low risk. ProPublica looked at more than 10,000 criminal defendants in Broward County, Florida, and compared their predicted recidivism rates with the rates that actually occurred over a two-year period. When most defendants are booked into jail, they respond to a COMPAS questionnaire. Their answers are fed into the COMPAS software to generate several scores, including predictions of “Risk of Recidivism” and “Risk of Violent Recidivism.”
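The error-rate comparison at the heart of this analysis can be sketched as follows. The defendant records are fabricated toy data; only the method – computing false positive and false negative rates per group – mirrors the approach described above.

```python
from collections import defaultdict

# Fabricated records: (group, predicted_high_risk, actually_reoffended).
records = [
    ("black", True, False), ("black", True, True), ("black", False, False),
    ("black", True, False), ("white", False, True), ("white", False, False),
    ("white", True, True), ("white", False, True),
]

counts = defaultdict(lambda: {"fp": 0, "fn": 0, "neg": 0, "pos": 0})
for group, predicted, actual in records:
    c = counts[group]
    if actual:
        c["pos"] += 1
        if not predicted:
            c["fn"] += 1  # labeled low risk but did reoffend
    else:
        c["neg"] += 1
        if predicted:
            c["fp"] += 1  # labeled high risk but did not reoffend

for group, c in counts.items():
    print(group, "FPR:", c["fp"] / c["neg"], "FNR:", c["fn"] / c["pos"])
```

In this toy data the black group’s false positive rate is high while the white group’s false negative rate is high, the same asymmetry the analysis reported; similar overall accuracy can hide exactly this kind of imbalance.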
    • Lecture 5 :
    • Algorithm Beats Case 3
    • Lecture 6 :
    • Angles for Investigating Algorithms
    • In considering angles on algorithms, Nicholas Diakopoulos in the Data Journalism Handbook identified four different driving forces for algorithmic accountability stories. Discrimination and unfairness: Uncovering discrimination and unfairness is a common theme in algorithmic accountability reporting. The story from ProPublica that led this chapter is a striking example of how an algorithm can lead to systematic disparities in the treatment of different groups of people. Northpointe, the company that designed the risk assessment scores, argued the scores were equally accurate across races and were therefore fair. But their definition of fairness failed to take into account the disproportionate volume of mistakes that affected black people. Stories of discrimination and unfairness hinge on the definition of fairness applied, which may reflect different political suppositions. Errors and mistakes: Algorithms can also be newsworthy when they make specific errors or mistakes in their classification, prediction or filtering decisions. Consider the case of platforms like Facebook and Google, which use algorithmic filters to reduce exposure to harmful content like hate speech, violence and pornography. This can be important for the protection of specific vulnerable populations, like children, especially in products like Google’s YouTube Kids which are explicitly marketed as safe for children. Errors in the app’s filtering algorithm are newsworthy because they mean that sometimes children encounter inappropriate or violent content. Classically, algorithms make two types of mistakes: false positives and false negatives. In the YouTube Kids scenario, a false positive would be a video mistakenly classified as inappropriate when it is actually fine for kids. A false negative is a video classified as appropriate when it is really not something you want kids watching.
Legal and social norms: Predictive algorithms can sometimes test the boundaries of established legal or social norms, leading to other opportunities and angles for coverage. Consider for a moment the possibility of algorithmic defamation. Defamation is defined as “a false statement of fact that exposes a person to hatred, ridicule or contempt, lowers him in the esteem of his peers, causes him to be shunned, or injures him in his business or trade”. Over the last several years there have been numerous stories, and legal battles, over individuals who feel they have been defamed by Google’s autocomplete algorithm. An autocompletion can link an individual’s or company’s name to everything from crime and fraud to bankruptcy or sexual conduct, which can then have consequences for reputation. Human misuse: Algorithmic decisions are often embedded in larger decision-making processes that involve people and algorithms, so-called sociotechnical systems. If algorithms are misused by the people in the sociotechnical ensemble, this may also be newsworthy. The designers of algorithms can sometimes anticipate and articulate guidelines for a reasonable set of use contexts for a system, and if people ignore these in practice it can lead to a story of negligence or misuse. The risk assessment story from ProPublica provides a salient example. Northpointe had in fact created two versions and calibrations of the tool, one for men and one for women. Statistical models need to be trained on data reflective of the population where they will be used, and gender is an important factor in recidivism prediction. Broward County was misusing the risk score designed and calibrated for men by using it for women as well. That will be all for this module. Please ensure you take the quiz. Learning does not stop here: you can watch the videos and read the associated materials of all the courses again to build your knowledge further. Thank you.
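The false positive / false negative distinction from the YouTube Kids example above can be made concrete with a tiny toy filter log. The video titles and labels are invented; this is not a real moderation API.

```python
# Invented filter log: (title, filter_says_inappropriate, truly_inappropriate).
videos = [
    ("cartoon_episode", True, False),   # false positive: safe video blocked
    ("violent_clip", False, True),      # false negative: harmful video shown
    ("craft_tutorial", False, False),   # true negative: correctly allowed
]

false_positives = [t for t, pred, truth in videos if pred and not truth]
false_negatives = [t for t, pred, truth in videos if truth and not pred]
print("blocked by mistake:", false_positives)
print("slipped through:", false_negatives)
```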
    • Lecture 7 :
    • Quiz
    • Quiz
  • How do I access the course after purchase?

    It's simple. When you sign up, you'll immediately have unlimited viewing of thousands of expert courses, paths to guide your learning, tools to measure your skills and hands-on resources like exercise files. There’s no limit on what you can learn and you can cancel at any time.
  • Are these video-based online self-learning courses?

    Yes. All of the courses come with online video-based lectures created by certified instructors. Instructors have crafted these courses with a blend of high-quality interactive videos, lectures, quizzes & real-world projects to give you in-depth knowledge of the topic.
  • Can I play & pause the course at my convenience?

    Yes, absolutely, and that's one of the advantages of self-paced courses. You can pause or resume the course at any time, move back and forth from one lecture to another, play the videos multiple times & so on.
  • How do I contact the instructor for any doubts or questions?

    Most of these courses have general questions & answers already covered within the course lectures. However, if you need any further help from the instructor, you can use the inbuilt Chat with Instructor option to send a message to an instructor & they will reply to you within 24 hours. You can ask as many questions as you want.
  • Do I need a PC to access the course, or can I do it on mobile & tablet as well?

    Brilliant question, isn't it? You can access the courses on any device like PC, mobile, tablet & even a smart TV. For mobile & tablet you can download the Learnfly Android or iOS app. If the mobile app is not available in your country, you can access the course directly by visiting our website; it's fully mobile-friendly.
  • Do I get any certificate for the courses?

    Yes. Once you complete any course on our platform along with the assessments provided by the instructor, you will be eligible to get a certificate of course completion.
  • For how long can I access my course on the platform?

    You require an active subscription to access courses on our platform. If your subscription is active, you can access any course on our platform with no restrictions.
  • Is there any free trial?

    Currently, we do not offer any free trial.
  • Can I cancel anytime?

    Yes, you can cancel your subscription at any time. Your subscription will auto-renew until you cancel, but why would you want to?


Blaise Aboh is a Citizen Data Analyst, consultant, and the founder of AI Envoy Robotics. He is also lead partner at Orodata, a civic technology organization studying and leveraging data-driven science and artificial intelligence to extract knowledge and insights from data. In the past 4 years, Blaise has equipped over 5,000 media professionals with tools and technologies to practice better data science and data journalism.
