Featured Image for the blog: How Netflix Moved Operations to the Cloud and Saw Revenue Boom: A Digital Transformation Case Study

How Netflix Moved Operations to the Cloud and Saw Revenue Boom: A Digital Transformation Case Study

Remember the time you had to request mail-order DVDs to catch the latest flicks while munching popcorn on your couch?

Me neither.

It’s strange to think that about a decade ago, streaming giant Netflix had a business model built around direct mail.

Request a movie, put a few in your queue for next time, and let the anticipation build as you wait for your first DVD to arrive on your doorstep.

Now, our instant gratification bells ring daily as we pour through episode after episode of new material. And, we can barely remember the (dark) time where we waited days for entertainment instead of having it literally at our fingertips.

The shift from mail-in orders to a cloud streaming service improved customer satisfaction and made Netflix billions.

The company’s move to the cloud came with a hike in customer loyalty and a brand that competitors still fight tooth and nail to beat in the market.

Netflix serves as the ultimate digital transformation case study.

They transformed their entire business model and charted unprecedented waters. Here’s how to use their model as inspiration for your contact center’s digital transformation.

How to move your operations to the cloud, Netflix style: A digital transformation case study.

21 years after they started renting DVDs, Netflix now sits at a valuation of almost $145 Billion .

They came to market as a disruptor of traditional video stores like Blockbuster and Family Video.

Netflix founders Reed Hastings and Marc Randolph wanted to bring customer-centricity to the video rental market. At the time, renting videos was inconvenient and costly, with customers often plagued by expensive late fees.

They created an entirely new way to watch movies and consume content. And as time went on and subscribers grew, they continued to shift to keep pace with new consumer demands.

In 2007 , they took their first step into the world of streaming video. They offered customers a streaming subscription in addition to the more traditional DVD rental service, giving customers the option to chart their own path.

Since then, they’ve seen exponential growth in subscribers and revenue. Let’s take a look at their trends over time. We’ll skip over the first few years of the company’s infancy and jump to the year the company went public.

Here’s how Netflix has grown since 2002.

A digital transformation case study: Charting how a move to the cloud boosted revenue and subscribers

That incredible growth trajectory, and willingness to change, made Netflix stock skyrocket by 6,230% in a 10-year period.

And, they did it all without crazy price hikes, keeping customers top-of-mind.

While Netflix has adjusted prices over the years, they strike a balance by adding more value and services for the dollar. In 2019 , the Basic plan increased by $1 a month (adding up to $12 annually). While the Standard and Premier plans rose by $2 per month, (adding up to $24 annually, for each plan).

Meanwhile, the company is putting some $15 billion towards creating new content binge-watchers will love.

After this price change, Netflix saw a slight blip in subscriber growth, with growth in Q2 coming in low. But, analysts don’t think for a second it’s the beginning of a downward trend. In fact, a similar event happened back in 2010 when Netflix moved to a pricing model that broke out streaming and video rentals. And they clearly rebounded.

When you put the numbers into perspective, you see this is the first dip in subscriber growth in nearly a decade. That’s pretty remarkable. And, revenue still increased for the quarter. It’s clear the value of the digital innovator’s services still outweighs the cost for most.

Plus, if you can post positive revenue numbers for over a decade and become a multi-billion-dollar company in about 20 years, you’re doing alright.

Here’s what Netflix did to reach these lofty heights. And, how you study the same tactics to lead your contact center through a successful digital transformation.

Stay true to your vision.

Netflix started out with the idea to make it easier and less expensive for people to watch movies.

A digital transformation case study for the books... i mean movies. It's one for the movies.

But they didn’t want to stay in the DVD game forever. They had the foresight to predict that consumer behaviors would continue to shift. And, they wanted to stay ahead of the competition.

Only, they didn’t sacrifice their vision when it came time for company-wide changes. Instead, they realigned their business strategies to fit their vision, even as consumers and trends shifted.

What you can do:

As you make digital shifts in your contact center and your company, keep your vision constant. While tons of other factors may orbit around you, your vision keeps you grounded.

Use your company vision to guide your decision-making. And, use data and trends to predict how your customer behavior will shift.

As you shift to keep pace with your customers’ needs, align your operations to your customer behaviors to realize your vision.

Reinvent the wheel if the old one doesn’t solve customer problems.

Netflix soared from seed idea to a $145-billion-dollar valuation in only 21 years. (Wow, they did that in less time than it took big tech vendors to break CSAT scores.)

And they didn’t get there by spinning up a new-and-improved version of Blockbuster.

Ted Sarandos, Head of Content at Netflix said when he came on board at the early stages of the company founder Reed Hastings used his vision to scale and innovate at Netflix.

“We never spent one minute trying to save the DVD business,” said Sarandos .

The company leaders didn’t stick to traditional best practices because they no longer worked for modern customers.

Instead of piggybacking off what other companies did, Netflix solved problems differently. And, they solved them better. The proof is in a bankrupt Blockbuster and dwindling Family Video stores.

Want to know what you’re missing when you only look at digital transformation best practices? Pop over to our article on the topic.

Tailor your path and contact center strategies to your specific business needs. Focus on listening and understanding your customers, with the help of better data and customer surveys .

Find out what’s causing your customers’ pain. See what common questions your customers have. Work with your sales team to find out why customers are fleeing competitors. Discover why they choose your products and services in the first place. Then, work with your contact center and company leaders to develop the methods to solve these pains.

Don’t get caught up in what your competition is doing. What they’re doing might work, but your actionable data and customer information can guide you to a way that works better.

If you’re going to be consumed by one thought, let it be this one: how might we better serve our customers?

Don’t force your customers down a single path.

In the early phases of Netflix, internet speeds weren’t built for streaming movies. People who tried to download and view movies online were only frustrated by the lengthy, often interrupted experience of watching a film online.

Netflix didn’t want to enter the streaming market until the right infrastructure was available to support a platform with high-quality and high-speed content. They didn’t want to taint their brand from day one, linking the Netflix name to all the baggage that came with poor streaming experiences.

At the same time, they were watching postage prices. The price of postage kept rising, and internet speeds were on the ups. By watching how the market and internet infrastructure changed, they identified the right moment to launch their first streaming service.

They tested their streaming service with lower-quality video, first. They wanted to gauge interest and customer experience without canceling their bread-and-butter DVD service.

Those who wanted access to the crisp DVD picture could still order movies to their doorstep. Others who wanted instant access could forgo the high-quality picture for convenience, instead.

Your contact center and customer experience will change. It has to. But as you make changes and shift your operations to the digital era, keep options open for your customers.

Just because chat and email are on the rise as popular customer service channels doesn’t mean every customer wants to use them. Use past data and communication history to learn more about your customers. Then, coach your agents to handle each interaction based on the customer’s preferences.

Bringing changes to your contact center has the potential to transform your customer experience for the better. But, without careful intention, it can also cause friction. Introduce changes to your customers slowly, and make sure your agents are always there to offer extra help through the process.

Use data and trends to personalize your customer experiences.

This one’s huge. It’s how Netflix keeps customers engaged with their platform, and how they coined the term binge-watching

As Netflix made changes in their operations, they watched their data like a hawk. They looked for trends on how people watched content, what kept them watching, and how personalization fueled content absorption. Then, they used an algorithm to serve up content tailored to their customers’ specific interests.

“Like a helpful video-store clerk, it recommended titles viewers might like based on others they’d seen.” – Twenty Years Ago, Netflix.com Launched. The Movie Business Has Never Been the Same , by Ashley Rodriguez for Quartz .

And, as their new cloud-based business let them scale globally, their data points multiplied.

Previously, Netflix could only mail DVDs to U.S. customers. Shipping DVDs overseas wouldn’t have been financially sustainable while keeping prices fair for all customers. Moving to an online business model allowed Netflix to target and reach new audiences without taking on the costs of shipping globally.

Doing this not only scaled their business, but it diversified their data and made their algorithm smarter. Enter, extreme personalization and binge-watching fever on a global scale.

Track and analyze data from your customer interactions. Create custom reports and dashboards to distill important findings from your data. Then, use the trends and patterns you find to personalize your customer service experiences.

From the way you send customer surveys to the tone your agents use, your interactions tell you what your customers want. Lean into your analytics for valuable insight into how to help your customers.

And, use the data to transform your contact center too. Customer data is a powerful tool to drive business change. If your metrics show customers aren’t happy, your company leaders want to know about it. And, they’ll want to fix it. There’s no better case for company transformation.

Netflix took risks to transform their business. But, there’s no bigger risk than stagnation. Staying the same doesn’t help you reach your contact center goals. Innovating and trying out your big ideas is what separates the leaders from the laggards.

Can your tech vendor survive in your digital transformation?

Learn how to choose vendors who make your transformation strategy possible.

Case Study: How Netflix uses Cloud for Innovation, Agility and Scalability

Suggested articles, top 5 predictions for the cloud in 2019, how to streamline your data archival process using the cloud, cloudsine showcases weborion to protect cloud-based websites and web applications at 10-11 april aws summit 2019, cloudsine @ div0 startup quarter – on 23 may 2019, cloud security seminar – on 4 sept 19, the future of digital workforce with intelligent automation seminar , 21 nov 19, case study: cloudsine accelerates centre for evidence and implementation (cei)’s cloud adoption journey, aws cloudgoat and mitigation strategies: part 1, aws cloudgoat and mitigation strategies: part 2, aws cloudgoat and mitigation strategies: part 3, aws cloudgoat and mitigation strategies: part 4, aws cloudgoat and mitigation strategies: part 5, weborion® launches javascript malware detection engine (jme), cloudsine | weborion® supports the launch of community-focused ai security quarter in div0 on 31 mar 2021, cloudsine is excited to partner with sginnovate new frontier event to build up deep tech community, product announcement: enhanced email alerts for weborion defacement monitor, cloudsine and weborion signs technology alliance partnership with new net technologies, sutd x cloudsine – artificial intelligence award, sginnovate’s powerx programme with cloudsine, statement on apache log4j2 remote code execution (rce) vulnerability on weborion products and customers – cve-2021-44228, the cybersecurity implications for website owners from the russia-ukraine conflict, how are hacktivists shaping the cybersecurity posture of nation-states in the russia-ukraine conflict, weborion® adds smart image hash (sih) feature to improve monitoring of compressed images, weborion® introduces ai nlp for web defacement monitoring, cloudsine | weborion® – technology alliance partnership with netrust pte ltd, the serverless model for the uninitiated, seamless integration with the weborion® api, the weborion® defacement monitor cloud saas is now available on aws marketplace, cloudsine exhibited in govware 2022, 18th to 20th october, dns – a brief summary of an easily overlooked system, enumerate, secure and detect changes in dns records, port scanning – exposing your network’s points of entry, anyone can enumerate your web server using port scanning tools, weborion® anti-defacement and web security stack is now available on indonesia’s lkpp e-katalog, weborion® anti-defacement dan web security stack kini sudah hadir di e-katalog lkpp indonesia, what’s new in pci-dss v4.0: payment page javascript monitoring, what’s new in pci-dss v4.0: http header tamper detection, magecart and card skimming detection, what’s new in pci-dss v4.0: supply chain inventory of software, what’s new in pci-dss v4.0: ssl cert monitoring, preventing web defacement: a technical manager’s guide to securing web applications.

“Planning without taking action is the slowest route to victory. Taking action without planning is the noise before defeat.” - Sun Tzu, The Art of War Introduction to cloud computing It is said that the world evolves at the speed of technological evolution. Organizations are constantly looking for new technologies...

“The best way to predict the future is to invent it.” - Peter Drucker, American economist and corporate philosopher Cloud computing, the advancement of computing over a network of servers, was a key driver of the tech industry in 2018. Mergers and acquisitions between large and small companies led to...

How to Streamline Your Data Archival Process using the Cloud Data archiving is the process of moving data that is no longer essential to a separate data store for long-term retention. Archived data consists of older data that might serve some importance to the organization, possibly for future reference or...

Event Date: 10-11 April 2019 Last Updated: 27 May 2019 The AWS Summit Singapore brings together the cloud computing community to connect, collaborate and learn about AWS. Cloudsine was recognized as an AWS Technology Partner and an AWS Consulting Partner last year. During the Summit, we showcased WebOrion™, an all-in-one...

Event Date: 23 May 2019 Last Updated: 27 May 2019 Startup Quarter - Secure your cloud, was a meetup organized by Division Zero (Div0) on 23rd May 2019 at ACE. Div0 an open, inclusive, and completely volunteer-driven cybersecurity community with a mission of promoting a vibrant cybersecurity community and safer...

Event Date: 4 Sept 2019 Last Updated: 10 Sept 2019 Cloudsine is glad to host our very own seminar on the theme of Cloud Security: Myths, New Security Concerns and Mitigations on 4 Sept 2019. Glad to engage and interact with an audience of Singapore and Indonesia customers, resellers and...

Event Date: 21 Nov 2019 Last Updated: 25 Nov 2019 Cloudsine had our first RPA seminar in collaboration with Automation Anywhere on the theme of the “Future of Digital Workforce with Intelligent Automation” on 21 November 2019. It was our pleasure to have Ehunt Siow and Sundarraj Subrammani from Automation...

Cloudsine Accelerates Centre for Evidence and Implementation (CEI)’s Cloud Adoption Journey by Providing Data Migration, a Customized File Portal that Integrates with AWS S3 and Cloud Data Security Assurance. The Centre for Evidence and Implementation (CEI)is a global team of research, policy and practice experts based in Australia, Singapore…

Part 1 of our own series of articles on CloudGoat and mitigation strategies. This is a step by step breakdown on how to interpret and think like an attacker and also how to go about mitigating the attacks.

Part 2 of our own series of articles on CloudGoat and mitigation strategies. This is a step by step breakdown on how to interpret and think like an attacker and also how to go about mitigating the attacks.

In this third part, we will explore privilege escalation using EC2 instance profile attachment to obtain full admin privileges on the AWS account and also exploiting SSRF on EC2’s metadata service to get credentials.

This is part 4 of the series on AWS Cloudgoat Scenarios and the mitigation strategies series where we explore and see how remote command injection on a web application can be used to compromise the AWS environment.

WebOrion® is pleased to announce the launch of our new Javascript Malware Detection Engine(JME). The JME adds to the powerful capabilities of our WebOrion® Monitor to detect defacements, malicious scripts and other website threats. Today, practically every website uses JavaScript. The power and flexibility of a scripting language embedded within...

AI Security Quarter of Div0 was officially launched on 31 Mar 2021 over virtual Zoom and attended by >50 cybersecurity and AI enthusiasts in Singapore. Cybersecurity and AI are both critically important technologies for the digital future. Attackers are using more automation and AI to help them probe and attack...

Cloudsine is excited to partner with SGInnovate at the New Frontier event on 10 Apr 2021 to help build up the deeptech community. The New Frontier event is organized by SGInnovate with Guest of Honour, Lawrence Wong (Minister for Education), to promote the growth of the deeptech ecosystem in Singapore...

Email alerts are the primary method that WebOrion Defacement Monitor uses to inform our customers about the changes to their websites. Through these email alerts, users are informed if their website becomes unreachable, or if any of WebOrion’s various engines are triggered during webpage monitoring. The email alerts are important...

Cloudsine, the parent company of WebOrion, is excited to announce the technology alliance partnership with New Net Technologies (NNT). NNT is a Cybersecurity and Compliance software company based in UK and is widely deployed in many Enterprise and Government Organizations globally. Cloudsine provides cloud consulting services and offers web defacement...

Cloudsine is honoured to partner with SUTD to sponsor the artificial intelligence award to nurture interests and identify talents in this area. Students from SUTD who are interested may enroll in 50.021 Artificial Intelligence module offered by the Information Systems Technology and Design (ISTD) pillar. In this course, students will...

"The PowerX Programme has given young Cybersecurity companies like us a boost to identify, recruit and train cyber talents that are critical for our growth." Matthias Chin, Founder and CEO of Cloudsine. The PowerX Cybersecurity and Software & Product Development programmes are SGInnovate’s 12-month programmes including structured training and industry...

There was a high severity vulnerability (CVE-2021-44228) impacting multiple versions of the Apache Log4j which was disclosed publicly on December 9, 2021. An attacker who can control log messages or log message parameters can execute arbitrary code loaded from LDAP servers when message lookup substitution is enabled. The vulnerability impacts...

Cyber Threat Activities from the Russia-Ukraine Cyberwar The Russian incursion into Ukraine has led to a conflict that involves both the physical and cyber domains, with hacking groups of differing allegiances launching cyberattacks on government, military, financial and telecommunication websites. Cybersecurity specialists worldwide have highlighted growing concerns that the intensifying...

The Resurgent Threat of Hacktivism As the Russia-Ukraine conflict intensifies, cyberwarfare continues to be waged between the two countries. Concerns remain that state-backed hacker groups may target organisations outside of Eastern Europe in retaliation for the global sanctions imposed on Russia, or as false-flag operations to further promote political narratives....

WebOrion® is glad to introduce a new feature into the existing Integrity Analytics engine – Smart Image Hash (SIH). SIH helps reduce false alerts regarding image changes by analysing them in a smarter way. Images can make websites look more attractive and have been widely adopted ever since the inclusion...

AI technologies have been widely applied to different fields, but have you ever heard of using AI technologies to monitor the defacement of webpages? WebOrion is glad to introduce a new engine to the WebOrion defacement monitoring platform – AI Natural Language Processing (NLP) Engine. This engine analyzes webpage-changes and...

Cloudsine, the parent company of WebOrion, is pleased to announce the technology alliance partnership with Netrust Pte Ltd. Netrust is an established company since 1997 and is Asia’s first Public Certification Authority (CA) andSingapore’s only commercial IMDA-accredited CA. Cloudsine provides cloud consulting services and offers web defacement detection and response...

“Serverless” is a buzzword that is thrown around especially in the cloud industry. For the inexperienced, it may seem intuitive - “server” and “less”. It does not mean having less servers, but it actually refers to lesser (or no) management of servers. Serverless services allow developers to build and run...

Discover what is the WebOrion® API and the benefits of integrating the API with various systems such as Content Management Systems like Wordpress, and in the coming days, SIEM and SOAR systems. Our simple and easy-to-follow demonstration will also show you how to seamlessly integrate the WebOrion® API with your...

For the past decade, Cloudsine has been working with Amazon Web Services (AWS) to serve the market. Cloudsine is a consulting and technology partner with AWS and has used cloud computing to build and run many secure applications to support enterprise and government customers across Asia Pacific countries including Singapore,...

Thanks to all who visited our booth at GovWare 2022, held on 18-20 October 2022, at Sands Expo and Convention Centre, Singapore. We sincerely hope that all who visited us were able to catch a glimpse of what we currently do in the cybersecurity space with WebOrion, as well as...

As a website owner, one would surely come across the Domain Name Service (DNS). DNS is an extremely critical system on the Internet, as it is a system that helps translate domain names (which are easily recognisable and remembered) into IP addresses. It is important for all website owners to...

In this video, we will be sharing with you why your DNS records are important. How an outsider can conduct DNS enumeration to determine the attack surface. What can you do to hide and secure your DNS records. What are some tools WebOrion provide that can detect changes to the...

In computer networking, ports are points of entry to your computer – virtual origins and/or destinations of network connections. Port number definition and standardisation is overseen by the Internet Assigned Numbers Authority (IANA). Based on the list maintained by IANA, there are three types of ports amongst the total number...

In this video, we will discuss how hackers can easily enumerate your web server and potentially find vulnerabilities that they can exploit. It is important to understand how these attacks work so that you can take steps to protect your server and your website. We will walk through the process...

Today, we are excited to announce WebOrion® Defacement Monitor and Restorer is listed as a partner of LKPP E-Katalog, https://e-katalog.lkpp.go.id/, for PEP Category, Software Security, and Antivirus License. This opens up a new channel for Indonesian public sector agencies to quickly start protecting and monitoring their websites from cyber attacks...

Hari ini, dengan gembira kami mengumumkan WebOrion® Defacement Monitor and Restorer terdaftar sebagai mitra LKPP E-Katalog, https://e-katalog.lkpp.go.id/, untuk Kategori Peralatan Elektronik dan Pendukungnya, Keamanan Perangkat Lunak, dan Lisensi Antivirus. Hal ini membuka saluran baru bagi lembaga sektor public dan Pemerintah di Indonesia untuk segera mulai melindungi dan memantau situs website...

PCI-DSS is a set of security standards designed to ensure that all companies that accept, process, store or transmit credit card information maintain a secure environment. This article is part of a series of articles under the “What’s New in PCI-DSS v4.0” series where we explore what has changed in...

What is Magecart? Magecart is a type of cybercriminal group that specializes in stealing credit card information from online stores (a.k.a card skimming). The group's attacks typically involve injecting malicious code into the checkout pages of e-commerce websites to steal payment card data from customers. The Magecart group is known...

While web defacements may not be the most prevalent cyber attack in recent years, the consequences of web defacements are real – reputations may be damaged, client-customer relationships may be broken, financial losses may occur, etc. Web defacements can come in various forms, visual or non-visual (script inclusions). Hackers may...

In this video, we'll be discussing the important topic of preventing web defacement - a type of cyber attack that involves unauthorized alteration of a website's content or appearance. As a technical manager or CTO, it's crucial to understand the methods and motivations behind web defacement attacks and take steps...

“Planning without taking action is the slowest route to victory. Taking action without planning is the noise before defeat.” – Sun Tzu, The Art of War

Introduction to cloud computing

It is said that the world evolves at the speed of technological evolution. Organizations are constantly looking for new technologies such as cloud computing to meet their goals strategically and to drive business value. This article will address how cloud computing can possibly help organizations adopt efficient technologies and also improve productivity.

Cloud computing refers to computing on a network of remote servers accessible over the web, in order to store, manage, and process data. It utilizes computing resources of cloud providers, such as their data centers, instead of having the organization build their own local infrastructure.

Regardless of whichever industry one’s company belongs to (finance, retail or real estate), it is always advisable to understand the technology that other corporations are adopting. This is to solidify a competitive advantage by examining the lessons learned and best practises developed along the way.  In order to understand this better, let us illustrate how Netflix utilized cloud services to reach its level of success today.

Amazon Web Services used by Netflix

The cloud is an enabling technology for AI to mine and analyze data for deeply embedded insights. Cloud computing contributed an innovative breakthrough of accelerators for AI software. An accelerator is a class of microprocessor or system that is designed to provide hardware acceleration for AI applications such as neural networks, computer vision and machine learning. Currently, it is rather difficult for on-premise hardware to match the processing power of the accelerator hardware residing in the worldwide data centers of cloud providers (Source: Gartner, The Google Guys). Furthermore, cloud providers possess one more advantage over non-cloud AI: their extensive global network of data centers are in the better position to process the massive amount of data being generated all over the world. This alone makes it substantially easier to train machine learning models and neural networks for data insights and pattern recognition.

Analyzing customer data creates customer insights for any organization. It helps management avoid making assumptions about customers, which may be misinterpreted by the customer as apathy. Data analytics and personalized customer assistance (PCA) features are actually the largest areas of innovation on the cloud for organizations, up till 2019 (Source: InsideBigData, Google Cloud, IBM).

What sort of cloud services aid in discovering data insights and building personalized assistance features for customers? For one, Netflix uses Amazon RDS and DynamoDB, which provide the structural organization that these are cloud services that helps to build, develop and deploy custom machine learning models for each organization based on its unique goals and work environments.

While deciding whether to produce “House of Cards”, whereby 26 episodes cost $100 million in production, Netflix decided that it was more intelligent to use data analytics to determine which fan bases its new drama should target. These data were captured on their database for analysis. Using machine learning, they were targeting its marketing appeal at the fans of the British House of Cards, as well as the long-time fans of actor Kevin Spacey and director David Fincher.

With cloud-based AI services, organizations can index their entire product/service catalogue based on each customer/user’s profile. Age, location, gender, and other profile data helped to determine which products should be ranked first for each individual customer. Customers with different likings and profile data would see a personalized set of recommended products specially curated for their viewing. These personalized services tend to make users feel important and valued by the enterprise, instead of just being a source of revenue and hence retains the organisation’s customer base.

Cloud agility refers to the rapid provisioning of computer-related resources. The Cloud environment can usually provide compute instances or storage in minutes. Before cloud providers took off with IaaS, one had to email infrastructure suppliers and wait for a few weeks before the supplier replied with the requested provisions. (Source: Netflix, Amazon Case Study on Netflix). The existing IaaS delivery is executed using the consoles of cloud providers, allowing a faster release of new features for users. The benefit of such services reduce the time taken to develop, test and deploy software applications.

Most successful companies share a common trait: they had people who started developing a product/service prototype way ahead of their peers. The reason for their success is rather obvious – the first-mover advantage. Cloud computing is a technology designed to help organizations obtain the first-mover advantage, as evident from their rich variety of service offerings.

How did Netflix utilize agility features of the cloud for the cloud migration of their operations? They rebuilt their app functions inside the native cloud development environments first, later including app development for business operations. The large, cumbersome Netflix service of 2008 was refactored into microservices and unstructured scalable databases.

Netflix’s cloud database usage followed a pay-as-you-use basis, which helped them save costs whenever they rolled out the AI based feature called top personalized recommendations. (the AI has to mine data from their database, and so the database has to be hosted properly and securely on Amazon’s cloud). This AI feature, top personalized recommendations, showed users niche titles that would not be available on traditional cable networks but were similar to content liked by the user. As such, users purchased these niche titles more, generating more revenue; Netflix no longer had to spend so much money on acquiring new content to sell to users. The costs saved were estimated to be $1 billion by Netflix’s Vice President and Chief Product Officer, in a research paper published by them.

With such a progressive implementation, the management became more strategic and informed about budget evaluations and approvals. The purchase of hardware and the progressive release schedule of the re-morphing Netflix became more streamlined day by day. Gradually, a large organization like Netflix was no longer constrained by physical compute-resources and grew to become the global Internet TV network everyone knows today.


Scalability refers to a software-based product or service which retains its intended function with no quality compromise when moved to an environment with more incoming customers. The user’s needs must be met no matter what changes and the response time should not get longer. The elaborations below highlight the relevance of cloud scalability to your organization.

By using services from cloud providers like AWS and Open Connect (for streaming), Netflix expanded its network of servers (both physical and virtual) from North America to the rest of the world, including areas like Europe and India.

Netflix is one example of an organization using the cloud. By running on AWS, it provided billions hours of service to customers around the globe. Users can order its products/services from almost anywhere in the world, using PCs, tablets, or mobile devices. 10,000 customer orders were processed every second during Netflix’s last peak demand season. This is a stark contrast from the few thousand DVD orders Netflix could handle in its early days before streaming and migrating to the cloud. Having 86 million customers worldwide who consume 150 million hours of content daily, this is rather strong evidence about how the cloud has powered Netflix’s scalability of business operations.

Cloud providers like AWS provide technologies such as container auto-scaling and application level load-balancing, to support the customer service that Netflix provides. Cloud providers possess the resources to handle the gigantic operational loads of their client organizations. The compute resources they provide are globally available, enabling customers from around the world to place orders literally anytime they prefer. Organizations that face a small home-country market no longer have to worry about global expansion.

Conclusion and Final thoughts

The most important action taken by enterprise organizations in 2018 was to engage a professional cloud vendor experienced in providing step-by-step solutions and enterprise-level cybersecurity.

Enterprise innovation is now centered upon (but not limited to) cloud-native machine learning models and data analytics. These technologies offer a pleasant side benefit, assisting organizations in managing their vast amount of customer and operations-related data. In the area of enterprise agility, cloud providers and third-party resellers have created intelligent software so as to help enterprises make important decisions faster than ever. In the area of scalability, the cloud has empowered various organizations to serve their users and customers around the world with better availability and response times. Selling to more customers beyond the home-country is now easier.

It is important for organizations to understand global industrial changes. The future of the cloud computing will continue to be several billion dollar industries, such as AI innovation, blockchain and cloud security (Source: Forbes). Hence, most organizations now find their boardroom discussions increasingly centered upon the topic of technology  in business strategy.

“Victorious warriors win first and then go to war, while defeated warriors go to war first and then seek to win”

― Sun Tzu, The Art of War


The Curie, 83 Science Park Drive,#02-01C, 118258, Singapore



  • The Magazine
  • Newsletters
  • Managing Yourself
  • Managing Teams
  • Work-life Balance
  • The Big Idea
  • Data & Visuals
  • Reading Lists
  • Case Selections
  • HBR Learning
  • Topic Feeds
  • Account Settings
  • Email Preferences

Netflix’s Bold Disruptive Innovation

  • Adam Richardson

Every now and then, the business world presents us with a lab experiment that we can observe in realtime. Netflix’s announcement that it is splitting off its DVD-by-mail business from its streaming business is just such an experiment. The DVD business will now go by the name Qwikster, and the streaming business will stay under […]

Every now and then, the business world presents us with a lab experiment that we can observe in realtime. Netflix’s announcement that it is splitting off its DVD-by-mail business from its streaming business is just such an experiment. The DVD business will now go by the name Qwikster, and the streaming business will stay under the Netflix brand. It is Clayton Christensen ‘s innovator’s dilemma incarnate, and Netflix is very publicly trying to solve it. Like its 60% price increase did earlier this year, this move is understandably causing consternation amongst some customers. It’s a bold move, one that will cost them in the near term, but Netflix I’m sure has done the calculus and is looking at the endgame 5-10 years out, not 5-10 months.

  • Adam Richardson is a Group Product Manager at Financial Engines , and was formerly a strategy leader at innovation firm Frog Design. He is the author of Innovation X: Why a Company’s Toughest Problems are its Greatest Advantage . Follow him on Twitter . Opinions expressed here are his own.

Partner Center


Digital Innovation and Transformation

Mba student perspectives.

  • Assignments
  • Assignment: Digital Winners & Losers

Netflix’s Growth Alongside Digital Transformation

netflix technology case study

Netflix has used the changing digital landscape to its advantage as its become one of the biggest entertainment companies today.

Netflix has been a big winner with the advancement of digital technologies. To date, they have amassed an astounding 117.6 million international subscribers in nearly 200 countries around the world. This success has come from primarily three reasons: 1) advancements in streaming capabilities accelerated in line or better than expected as Netflix transitioned to a primarily streaming service; 2) the proliferation of mobile phones and tablets as well as the introduction of smart televisions allowed Netflix to be available to its customers at all times; and 3) the change in viewing tastes because of the first two greatly enhanced Netflix’s value proposition.

The first company to think about delivering content via streaming technologies was not, in fact, Netflix, but rather its old competitor Blockbuster. In the early 2000s, Blockbuster formed a team to explore the possibility of having its content library available to its customers online. However, as most people still had dial up internet connections and with broadband just beginning, streaming did not function well enough to serve its customers with adequate connections. When Netflix pivoted from its lucrative DVD mail order business to a service that relied on streaming first, the technology had finally caught up and its customers could watch many of its movies and shows online clearly and completely. The transition happened faster in the United States than Netflix had anticipated, leading to an increasing number of subscribers from which Blockbuster and other services could not match. As internet connections have improved throughout the world, Netflix has continued to ride this trend and provide its international customers with the same, clear experience.

Not only did Netflix have faster connections to its customers’ computers so they could view its content online, mobile technology advanced so that Netflix could function across multiple of its customers’ devices, making it a much more necessary subscription service to have. The success of the iPhone in making Americans want smartphones and the follow on introductions of other smart devices like tablets gave Netflix platforms to have more touches with its customers. In 2010 it launched the Netflix application for iPhones and iPod Touches. Again this accelerated with international growth and the proliferation of smartphones around the world.

Pathways to a Just Digital Future

Finally, as a byproduct of the changing technological landscape, customer tastes changed in the way in which they viewed content and what they were viewing in ways that Netflix took advantage of. Because screens were now available at all times, video became more of an essential part of people’s lives. No longer did people feel they had to sit down in movie theaters or wait for their weekly television shows. Because they were constantly connected to fast internet, they could watch short form video whenever they wanted. Netflix, unlike traditional film and television production studios, was in the prime position to deliver this content. Their data showed what their customers watched, in what increments, and what they desired more of. All this made Netflix the behemoth it currently is.

As Netflix goes forward it will have to continue to use technological advancements to be everywhere consumers watch content and have the content that allows them to keep growing subscribers. But since their origins as a DVD by mail service, they have used digital transformation to grow passed most of their competitors.

Student comments on Netflix’s Growth Alongside Digital Transformation

Interesting post, Ari! Tough to imagine a life without Netflix now! I wonder what the future has in store for the company. When Netflix started streaming, it was providing a product that was pretty easy to replicate and, as a result, we’ve seen a multitude of other content streamers enter the space. As you mentioned, Netflix then moved to original content to differentiate, and many of their competitors followed suit. I remember reading that Disney even pulled their content from Netflix for their own streaming service. Moving forward, there is increased pressure on Netflix to continuously produce great content or find some other way to prevent itself from becoming essentially a commoditized business. It’ll be interesting to see how it plays out!

Thanks Ari! My biggest concerns about Netflix are old people and poor people. In the US, Netflix has super high penetration in the younger crowd but still has a ways to go with the old and the poor. As broadband penetration stops growing, (in the high 80%’s in the US), we are seeing that for some the internet, and therefore Netflix, is either not affordable or important. Although the international growth you describe is encouraging and likely to distract for a while, I fear a meaningful slow down in the US will change the stock’s narrative and bring its valuation back to earth.

Interesting post. It’s been very interesting watching how Netflix has remained nimble and responsive to its users over time. They’ve readily adapted their business model, embracing change and new technology paradigms as they’ve come along. It seems this is largely guided by their data-driven analyses and focus on customer behaviors and experience. It will be further interesting to see how this plays out compared to Amazon Instant Video and other streaming offerings.

Thanks for the post! Whenever we talk about Netflix’s success, I’m always reminded of the power of data. It only helps provide better recommendations to its customers creating a better customer experience, but more importantly, Netflix was able to create original content tailored to customer’s needs. I was surprised to read in an article that Netflix’s efforts in original content have been earning near 100 Emmy nominations in recent years. I’m interested to see how the market changes as more networks provide their own streaming services, as well as more tech companies start to create original content. (For example, Facebook recently launched the Facebook Watch.)

Leave a comment Cancel reply

You must be logged in to post a comment.

  • Explore AI by Industry PLUS
  • Consumer goods
  • Heavy industry
  • Natural resources
  • Professional services
  • Transportation
  • AI Best Practice Guides PLUS
  • AI White Paper Library PLUS
  • AI Business Process Explorer PLUS
  • Enterprise AI Newsletter
  • Emerj Plus Research
  • AI in Business Podcast
  • The AI Consulting Podcast
  • AI in Financial Services Podcast
  • Precisely – Building Trust in Data
  • Shift Technology – How Insurers are Using AI
  • Uniphore – The Future of Banking CX in APAC
  • Uniphore – The Economic Impact of Conversational AI and Automation
  • Uniphore – The Future of Complaints Management
  • Uniphore – Conversational AI in Banking

Artificial Intelligence at Netflix – Two Current Use-Cases


Ryan Owen holds an MBA from the University of South Carolina, and has rich experience in financial services, having worked with Liberty Mutual, Sun Life, and other financial firms. Ryan writes and edits AI industry trends and use-cases for Emerj's editorial and client content.

Artificial Intelligence at Netflix

Netflix launched in 1997 as a mail-based DVD rental business. Alongside the growing US DVD market in the late 1990s and early 2000s, Netflix’s business grew and the company went public in 2002. Netflix posted its first profit a year later. By 2007, Netflix introduced its streaming service, and by 2013, the company began producing original content.

Today, Netflix is one of the world’s largest entertainment services with over 200 million paid memberships spanning 190 countries, according to the company’s 2020 Annual Report . As of January 2022, Netflix trades on the Nasdaq with a market cap that exceeds $260 billion. For the fiscal year ended December 31, 2020, Netflix reported revenues of nearly $25 billion.

The research function at Netflix follows a decentralized model, with “many teams that pursue research in collaboration with business teams, engineering teams, and other researchers,” according to the company’s Netflix Research website, launched in 2018. The company’s research areas include:

  • Machine learning
  • Recommendations
  • Experimentation and causal inference
  • Encoding and quality
  • Computer vision

In this article, we’ll look at how Netflix has explored AI applications for its business and industry through two unique use-cases:

  • Image Personalization for Viewers — Netflix uses artificial intelligence and machine learning to predict which images best engage which viewers as they scroll through the company’s many thousands of titles.
  • AVA: Creating Appropriate Thumbnail Images — Netflix has created AVA to source stills from its many thousands of titles that will eventually become the representative images that the company uses to drive viewer engagement.

We will begin by examining how Netflix has turned to machine learning technology to predict the imagery that will resonate most with viewers when they see suggested titles on their Netflix screens.

Image Personalization for Viewers

Netflix has earned its place in the entertainment industry in large part due to its personalized recommendation system that aims to deliver the titles a viewer most likely wants to see at a given time. However, with its extensive library of over 16,800 titles worldwide, according to research compiled by Flixwatch , a Netflix database site, how does Netflix suggest a title’s relevance to a specific member when they are scrolling through hundreds, or even thousands, of offerings?

Netflix research shows that members will invest approximately one minute scrolling through those offerings before they give up. Before the platform loses that viewer to a competing service—or some other activity altogether—Netflix wants to grab their attention. To do this, they’ve turned to the artwork the platform uses to represent each of its titles.

“Given the enormous diversity in taste and preferences,” Netflix asks , “wouldn’t it be better if we could find the best artwork for each of our members to highlight the aspects of a title that are specifically relevant to them ?”

Netflix uses the video below to show how, without artwork, much of the visual interest—and engagement—of the company’s experience is removed.

To build much of its platform, Netflix has relied heavily on batch machine learning approaches informed by algorithms that reflect A/B testing results. However, when determining which artwork will resonate with which viewers, this approach results in delays during:

  • Data generation
  • Model development
  • A/B testing execution and analysis

To apply image personalization to its library of titles, Netflix has turned to an online machine learning framework called contextual bandits . Through contextual bandits, Netflix claims , the company can “rapidly figure out the optimal personalized artwork solution for a title for each member and context. … by trad[ing] off the cost of gathering training data required for learning an unbiased model on an ongoing basis with the benefits of applying the learned model to each member context.”

Netflix goes on to explain that they obtain the training data through the “injection of controlled randomization in the learned model’s predictions.”

By considering user-specific factors like viewing history and country, Netflix claims to emphasize themes through the artwork it shows as members scroll their screens. Here Netflix’s then-Director of Machine Learning shows how artwork is personalized for a title like “Stranger Things.”

In another example, the Netflix TechBlog explores how an image is chosen that represents the movie, “Good Will Hunting.” The post explains that if a viewer has a viewing history that includes romance movies, they may see a thumbnail image of Matt Damon and Minnie Driver together. If that viewer watches a lot of comedies, however, they may instead be shown a thumbnail image of Robin Williams.

netflix technology case study

Source: Netflix

While our research did not identify specific results related to increased viewings of specific titles due to these technologies, Netflix does disclose that they have realized positive results through their own A/B testing and that the biggest benefits have come from promoting less well-known titles. Given these results, Netflix is now exploring further customization in how it presents its selections to viewers by adapting on-screen areas like:

AVA: Creating Appropriate Thumbnail Images

Before Netflix can choose which thumbnail images best engage which viewers, the company must generate multiple images for each of the thousands of titles the service offers to its members. In the early days of the service, Netflix sourced title images from its studio partners, but soon concluded that these images did not sufficiently engage viewers in a grid format where titles live side by side.

Netflix explains : “Some were intended for roadside billboards where they don’t live alongside other titles. Other images were sourced from DVD cover art which don’t work well in a grid layout in multiple form factors (TV, mobile, etc.).”

As a result, Netflix began to develop their own thumbnail images, or stills from “static video frames” that come from the source content itself, according to the Netflix TechBlog . However, if, for example, a one-hour episode of “Stranger Things” contains some 86,000 static video frames, and each of the show’s first three seasons has eight episodes, Netflix could have more than two million static video frames to analyze and choose from.

Netflix soon concluded that relying on the “in-depth expertise” of human curators or editors in selecting these thumbnail images “presents a very challenging expectation.” To scale its effort to create as many stills as possible for each of its titles, Netflix turned to AVA , “a collection of tools and algorithms designed to surface high quality imagery from the videos on [the] service.”

Netflix states that AVA scans each frame of every title in the Netflix library to evaluate contextual metadata and identify “objective signals” that ranking algorithms then use to identify frames that meet the service’s “aesthetic, creative, and diversity objectives” required before they can qualify as thumbnail images. According to Netflix, these factors include :

  • Face detection, including pose estimation and sentiment analysis
  • Motion estimation, including motion blur and camera movement
  • Camera shot identification, including estimation of cinematographer intent
  • Object detection, including importance determination of non-human subjects

This Frame Annotation process focuses on frames that represent the title and interactions between the characters, while setting aside frames with unfortunate traits like blinking, blurring, or that capture characters in mid-speech, according to a Netflix Research presentation .

netflix technology case study

Source: Netflix TechBlog

To train the underlying Convolutional Neural Network (CNN), Netflix assembled a dataset of some twenty thousand faces (positive and negative examples) from movie artwork, thumbnails, and random movie frames, the company claims .

The CNN also evaluates the prominence of each character by evaluating the frequency with which the character appears by him- or herself and with other characters in the title. This helps “prioritize main characters and de-prioritize secondary characters or extras,” Netflix claims .

Through its analysis, each frame receives a score that represents the strength of its candidacy as a thumbnail image. Per Netflix , AVA considers the following elements when it forms the final list of images that best represent each title:

  • Actors, including prominence, relevance, posture, and facial landmarks
  • Image Diversity, including camera shot types, visual similarity, color, and saliency maps
  • Maturity Filters, including screening for harmful or offensive elements

While our research did not identify any results specific to AVA’s use within Netflix, the company hopes that AVA will save creative teams time and resources as it surfaces the best stills to consider for candidates as thumbnail images and that the technology will drive more and better options to present to viewers during that crucial minute that viewers allow before they lose interest and search for another way to spend their time.

Related Posts

The Walt Disney Company began in 1923 as the Disney Brothers Cartoon Studio. By 1940,…

Toyota came to the United States in the late 1950s, setting up its US headquarters…

AT&T traces its history to 1875 when Bell Telephone was founded soon after Alexander Graham…

Founded in 1977 by a team of engineers led by Larry Ellison, Oracle became the…

General Electric (GE) was founded in 1889 by J.P. Morgan and Anthony J. Drexel who…

Related posts (5)

Artificial Intelligence at Disney

Artificial Intelligence at Disney

The Walt Disney Company began in 1923 as the Disney Brothers Cartoon Studio. By 1940, Walt Disney Productions issued its first stock. Today, the multinational family entertainment and media conglomerate is one of the Big 6 media companies. 

Artificial Intelligence at Toyota

Artificial Intelligence at Toyota

Toyota came to the United States in the late 1950s, setting up its US headquarters in California. A decade later, the Japanese automaker became the third-largest import brand in the United States. In 1968, Toyota introduced the Corolla, now the world’s best-selling passenger car. Today, Toyota is rebranding itself as a mobility company. 

Artificial Intelligence at AT&T

Artificial Intelligence at AT&T – Two Current Use-Cases

AT&T traces its history to 1875 when Bell Telephone was founded soon after Alexander Graham Bell patented his invention of the telephone. Today, AT&T is the world’s largest telecom and the largest mobile telephone service provider in the US. 

Artificial Intelligence at Oracle

Artificial Intelligence at Oracle – Two Current Use-Cases

Founded in 1977 by a team of engineers led by Larry Ellison, Oracle became the world's largest database management company by 1987. Today, Oracle claims a long list of innovations including:

AI at General Electric

Artificial Intelligence and Digital Twins at General Electric

General Electric (GE) was founded in 1889 by J.P. Morgan and Anthony J. Drexel who came together to finance Thomas Edison’s research and merge their companies together. Originally, GE was an industrial and consumer products company but today, more than 130 years later, GE has transformed itself into a multinational, digital industrial corporation ranked as the 33rd largest company in the United States by gross sales in 2020, according to Fortune 500.

  • Market Reasearch and Advisory
  • AI Presentations and Keynotes
  • Emerj Plus Membership
  • AI In Business Podcast
  • AI In Finance Services Podcast
  • Subscribe to our AI Newsletter
  • Advertise with us
  • Terms and Conditions
  • Refund and Cancellation Policy
  • Privacy Policy

netflix technology case study

Book cover

World Conference on Information Systems and Technologies

WorldCIST 2020: Trends and Innovations in Information Systems and Technologies pp 590–599 Cite as

The Power of Digitalization: The Netflix Story

  • Manuel Au-Yong-Oliveira 20 , 21 ,
  • Miguel Marinheiro 20 &
  • João A. Costa Tavares 20  
  • Conference paper
  • First Online: 18 May 2020

4345 Accesses

2 Citations

8 Altmetric

Part of the Advances in Intelligent Systems and Computing book series (AISC,volume 1161)

The evolution of technology, and mainly the evolution of the Internet, has improved the way business is done. Nowadays, most services are offered through a website or through an app, as it is much more convenient and suitable for the customer. This business transformation made it possible to get a faster and cheaper service, and companies had to adapt to the change, in order to fulfill customers’ requirements. In this context, this paper relates to this digital transformation, focusing on a case study about Netflix, a former DVD rental company and currently an online streaming leader. We aimed to understand Netflix’s behavior alongside this digital wave. Thus, we performed a survey, which had 74 answers, mainly from Portugal, but also from Spain, Belgium, Italy, Turkey, Georgia and Malaysia. Of the people who answered the survey, 90.1% were stream consumers, but only 59.1% had premium TV channels. From those 90.1%, 58.3% also said that they watched streams between two and four times per week, but the majority of premium TV channel subscribers (63.8%) replied that they watch TV less than twice in a week. We see a trend in which the traditional TV industry is in decline and streaming as a service has increased in popularity. Consumer habits are changing, and people are getting used to the digitalization era. Netflix is also confirmed in our survey as the market leader of the entertainment distribution business, as stated in the literature, and the biggest strength of this platform is its content.

  • Digital transformation
  • Online streaming

This is a preview of subscription content, log in via an institution .

Buying options

  • Available as PDF
  • Read on any device
  • Instant download
  • Own it forever
  • Available as EPUB and PDF
  • Compact, lightweight edition
  • Dispatched in 3 to 5 business days
  • Free shipping worldwide - see info

Tax calculation will be finalised at checkout

Purchases are for personal use only

Leiner, B., Cerf, V., Clark, D., Kahn, R., Kleinrock, L., Lynch, D., Postel, J., Roberts, L.G., Wolff, S.: Brief History of the Internet—Internet Society (2009)

Google Scholar  

Investopedia. https://www.investopedia.com/articles/personal-finance/121714/hulu-netflix-and-amazon-instant-video-comparison.asp . Accessed 03 Dec 2019

Littleton, C., Roettgers, J.: How Netflix Went From DVD Distributor to Media Giant (2018). https://variety.com/2018/digital/news/netflix-streaming-dvds-original-programming-1202910483/ . Accessed 31 Oct 2019

Business Insider. https://www.businessinsider.com/how-netflix-has-looked-over-the-years-2016-4#in-2010-streaming-begins-to-be-more-than-an-add-on-and-gets-prominent-real-estate-on-the-home-page-5 . Accessed 03 Dec 2019

Netflix. https://www.netflix.com/browse . Accessed 03 Dec 2019

Oomen, M.: Netflix: How a DVD rental company changed the way we spend our free time (2019). Business Models Inc. https://www.businessmodelsinc.com/exponential-business-model/netflix/ . Accessed 31 Oct 2019

Venkatraman, N.V.: Netflix: A Case of Transformation for the Digital Future (2017). https://medium.com/@nvenkatraman/netflix-a-case-of-transformation-for-the-digital-future-4ef612c8d8b . Accessed 31 Oct 2019

BMI - Business Models Inc. https://www.businessmodelsinc.com/exponential-business-model/netflix/ . Accessed 03 Dec 2019

Calia, R.C., Guerrini, F.M., Moura, G.L.: Innovation networks: from technological development to business model reconfiguration. Technovation 27 (8), 426–432 (2007)

Article   Google Scholar  

Ritter, T., Lund, C.: Digitization capability and the digitalization of business models in business-to-business firms: past, present, and future. Ind. Mark. Manag. (November), 1–11 (2019)

Hong, S.H.: The recent growth of the internet and changes in household-level demand for entertainment. Inf. Econ. Policy 19 (3–4), 304–318 (2007)

Evens, T.: Clash of TV platforms: how broadcasters and distributors build platform leadership. In: 25th European Regional Conference of the International Telecommunications Society (ITS), Brussels, Belgium, 22–25 June 2014. ECONSTOR (2014)

Aliloupour, N.P.: Impact of technology on the entertainment distribution market: the effects of Netflix and Hulu on cable revenue. Open access senior thesis. Bachelor of Arts. Claremont Graduate University (2015)

Johnson, C.M.: Cutting the cord: leveling the playing field for virtual cable companies. Law School Student Scholarship, Paper 497 (2014)

Pardo, A.: Digital hollywood: how internet and social media are changing the movie business. In: Friedrichsen, M., Muhl-Benninhaus, W. (eds.) Handbook of Social Media Management, pp. 329–348 (2013)

Bryman, A., Bell, E.: Business Research Methods, 4th edn. Oxford University Press, Oxford (2015)

Alvarez, E.: Netflix is taking a wait-and-see approach to virtual reality (2018). Engadget. https://www.engadget.com/2018/03/07/netflix-virtual-reality-not-a-priority/ . Accessed 31 Oct 2019

Nhan, J., Bowen, K., Bartula, A.: A comparison of a public and private university of the effects of low-cost streaming services and income on movie piracy. Technol. Soc. 60 , 101213 (2020)

Comissão Europeia - Portugal – A PAC no seu país. https://ec.europa.eu/info/sites/info/files/food-farming-fisheries/by_country/documents/cap-in-your-country-pt_pt.pdf . Accessed 20 Jan 2020

Gonçalves, R., Oliveira, M.A.: Interacting with technology in an ever more complex world: designing for an all-inclusive society. In: Wagner, C.G. (ed.) Strategies and Technologies for a Sustainable Future, pp. 257–268. World Future Society, Boston (2010)

Fontoura, A., Fonseca, F., Piñuel, M.D.M., Canelas, M.J., Gonçalves, R., Au-Yong-Oliveira, M.: What is the effect of new technologies on people with ages between 45 and 75? In: Rocha, Á., et al. (eds.) New Knowledge in Information Systems and Technologies, WorldCist 2019, La Toja Island, Spain, 16–19 April. Advances in Intelligent Systems and Computing (Book of the AISC Series), vol. 932, pp. 402–414. Springer (2019)

Download references

Author information

Authors and affiliations.

Department of Economics, Management, Industrial Engineering and Tourism, University of Aveiro, 3810-193, Aveiro, Portugal

Manuel Au-Yong-Oliveira, Miguel Marinheiro & João A. Costa Tavares

GOVCOPP, Aveiro, Portugal

Manuel Au-Yong-Oliveira

You can also search for this author in PubMed   Google Scholar

Corresponding author

Correspondence to Manuel Au-Yong-Oliveira .

Editor information

Editors and affiliations.

Departamento de Engenharia Informática, Universidade de Coimbra, Coimbra, Portugal

Álvaro Rocha

College of Engineering, The Ohio State University, Columbus, OH, USA

Hojjat Adeli

FEUP, Universidade do Porto, Porto, Portugal

Luís Paulo Reis

DIMES, Università della Calabria, Arcavacata, Italy

Sandra Costanzo

Faculty of Electrical Engineering, University of Montenegro, Podgorica, Montenegro

Irena Orovic

Universidade Portucalense, Porto, Portugal

Fernando Moreira

Rights and permissions

Reprints and permissions

Copyright information

© 2020 The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper.

Au-Yong-Oliveira, M., Marinheiro, M., Costa Tavares, J.A. (2020). The Power of Digitalization: The Netflix Story. In: Rocha, Á., Adeli, H., Reis, L., Costanzo, S., Orovic, I., Moreira, F. (eds) Trends and Innovations in Information Systems and Technologies. WorldCIST 2020. Advances in Intelligent Systems and Computing, vol 1161. Springer, Cham. https://doi.org/10.1007/978-3-030-45697-9_57

Download citation

DOI : https://doi.org/10.1007/978-3-030-45697-9_57

Published : 18 May 2020

Publisher Name : Springer, Cham

Print ISBN : 978-3-030-45696-2

Online ISBN : 978-3-030-45697-9

eBook Packages : Intelligent Technologies and Robotics Intelligent Technologies and Robotics (R0)

Share this paper

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Publish with us

Policies and ethics

  • Find a journal
  • Track your research
  • Bahasa Indonesia
  • Sign out of AWS Builder ID
  • AWS Management Console
  • Account Settings
  • Billing & Cost Management
  • Security Credentials
  • AWS Personal Health Dashboard
  • Support Center
  • Expert Help
  • Knowledge Center
  • AWS Support Overview
  • AWS re:Post

Additional Resources

Get started.

Organizations of all sizes across all industries are transforming their businesses and delivering on their missions every day using AWS. Contact our experts and start your own AWS journey today.

deprecated-browser pixel tag

Ending Support for Internet Explorer

CBS Research Portal Logo

Disruptive Innovation: A Case Study on How Netflix is Transforming the Living Room

Student thesis : Master thesis

Innovation has always been a crucial factor in business strategy across various market segments. In light of the digitalization revolution, the entertainment industry has been affected greatly, both in positive and negative ways. Long standing market incumbents such as Blockbuster have felt the disruptive shift of a new market player, Netflix. Its disruptively innovative strategy was simple enough to cater to small consumer segments, while rapidly gaining market traction. Eventually Netflix disrupted not only the market giant Blockbuster, but also consumers’ living rooms. Clayton M. Christiansen’s theory on disruptive innovation provides context and guidelines in better understanding the differences between sustaining innovation and disruptive innovation. Furthermore, it reflects over “The Innovators Dilemma” where, innovators must decide how to best invest their resources so as not to loose market share. This Thesis aims to better understand the effects of disruptive innovation within the entertainment content industry. The research utilizes a case study approach, using Netflix as the case company. Due to technological advancements the TV and entertainment content industry has drastically changed with new methods of consuming content, and new business models to disrupt the market. Having disrupted the market, Netflix remains a leading force among consumers. Moreover, in recent years, the competition within the market has radically increased. The project goes on to explore Netflix’s possible outcomes for future markets.

Documents & Links

File : application/pdf, 4.75 MB

Type : Text file


  • Product Engineering And Development Simform acts as a strategic software engineering partner to build products designed to cater the unique requirements of each client. From rapid prototyping to iterative development, we help you validate your idea and make it a reality.
  • Performance Engineering and Testing Our service portfolio offers a full spectrum of world-class performance engineering services. We employ a dual-shift approach to help you plan capacity proactively for increased ROI and faster delivery.
  • Digital Experience Design Work with cross-functional teams of smart designers and product visionaries to create incredible UX and CX experiences. Simform pairs human-centric design thinking methodologies with industry-led tech expertise to transform user journeys and create incredible digital experience designs.
  • Application Management and Modernization Simform’s application modernization experts enable IT leaders to create a custom roadmap and help migrate to modern infrastructure using cloud technologies to generate better ROI and reduce cloud expenditure.
  • Project Strategy At Simform, we don’t just build digital products, but we also define project strategies to improve your organization’s operations. We use Agile software development with DevOps acceleration, to improve the software delivery process and encourage reliable releases that bring exceptional end-user experience.
  • AWS DevOps Consulting Develop scalable cloud-native apps with ease
  • AWS SaaS Development Build competitive SaaS apps with best experts & tools
  • AWS Data Engineering Collect, process, and analyze data with AWS data engineering tools
  • AWS Cloud Migration Scale your infrastructure with AWS cloud migration
  • Digital Customer Experience (DCX) Create seamless customer experiences with AWS
  • Well-Architected Framework (WAR) Leverage AWS best practices to optimize your infrastructure
  • AWS Immersion Day Program Tailored training to maximize the use of AWS platform
  • Modern Cloud Operations with EKS Build, manage, and scale containerized applications with ease
  • Cloud Native App Development Build, test, deploy, and scale on the cloud
  • Cloud Consulting Audit cloud infrastructure, optimize cost and maximize cloud ROI
  • Microservice Architecture Remodel your app into independent and scalable microservices
  • Kubernetes Consulting Container orchestration made simple
  • Cloud Migration Consulting Assess, discover, design, migrate and optimize the cloud workloads
  • Cloud Assessment Assess cloud spending, performance, and bottlenecks
  • Serverless Seize the power of auto-scaling and reduced operational costs
  • Cloud Architecture Design Optimize your architecture to scale effectively
  • DevOps Consulting DevOps implementation strategies to accelerate software delivery
  • Infrastructure Management and Monitoring Competently setup, configure, monitor and optimize the cloud infrastructure
  • Containerization and Orchestration Reliably manage the lifecycle of containers in large and dynamic environments
  • Infrastructure as a Code Manage and provision IT infrastructure though code
  • CI/CD Implementation Automate and efficiently manage complex software development
  • BI and Data Engineering Our Data and BI experts help you bridge the gap between your data sources and business goals to analyze and examine data, gather meaningful insights, and make actionable business decisions.
  • Test Automation Reduce manual testing and focus on improving the turnaround time
  • Microservice Testing Make your microservices more reliable with robust testing
  • API Testing Build safer application and system integrations
  • Performance Testing Identify performance bottlenecks and build a stable product
  • Load Testing Achieve consistent performance under extreme load conditions
  • Security Testing Uncover vulnerabilities and mitigate malicious threats
  • Technology Partnerships Reap benefits of our partnerships with top infrastructure platforms
  • Process Management Right processes to deliver competitive digital products
  • Technology Comparisons
  • How it works

How Netflix Became A Master of DevOps? An Exclusive Case Study

Find out how Netflix excelled at DevOps without even thinking about it and became a gold standard in the DevOps world.

netflix technology case study

Table of Contents

  • Netflix's move to the cloud

Netflix’s Chaos Monkey and the Simian Army

Netflix’s container journey, netflix’s “operate what you build” culture, lessons we can learn from netflix’s devops strategy, how simform can help.

Even though Netflix is an entertainment company, it has left many top tech companies behind in terms of tech innovation. With its single video-streaming application, Netflix has significantly influenced the technology world with its world-class engineering efforts, culture, and product development over the years.

One such practice that Netflix is a fantastic example of is DevOps. Their DevOps culture has enabled them to innovate faster, leading to many business benefits. It also helped them achieve near-perfect uptime, push new features faster to the users, and increase their subscribers and streaming hours.

With nearly 214 million subscribers worldwide and streaming in over 190 countries , Netflix is globally the most used streaming service today. And much of this success is owed to its ability to adopt newer technologies and its DevOps culture that allows them to innovate quickly to meet consumer demands and enhance user experiences. But Netflix doesn’t think DevOps.

So how did they become the poster child of DevOps? In this case study, you’ll learn about how Netflix organically developed a DevOps culture with out-of-the-box ideas and how it benefited them.

Simform is a leading DevOps consulting and implementation company , helping businesses build innovative products that meet dynamic user demands efficiently. To grow your business with DevOps, contact us today!

Netflix’s move to the cloud

It all began with the worst outage in Netflix’s history when they faced a major database corruption in 2008 and couldn’t ship DVDs to their members for three days. At the time, Netflix had roughly 8.4 million customers and one-third of them were affected by the outage. It prompted Netflix to move to the cloud and give their infrastructure a complete makeover. Netflix chose AWS as its cloud partner and took nearly seven years to complete its cloud migration.

Netflix didn’t just forklift the systems and dump them into AWS. Instead, it chose to rewrite the entire application in the cloud to become truly cloud-native, which fundamentally changed the way the company operated. In the words of Yury Izrailevsky, Vice President, Cloud and Platform Engineering at Netflix:

“We realized that we had to move away from vertically scaled single points of failure, like relational databases in our datacenter, towards highly reliable, horizontally scalable, distributed systems in the cloud.”

As a significant part of their transformation, Netflix converted its monolithic, data center-based Java application into cloud-based Java microservices architecture. It brought about the following changes:

  • Denormalized data model using NoSQL databases
  • Enabled teams at Netflix to be loosely coupled
  • Allowed teams to build and push changes at the speed that they were comfortable with
  • Centralized release coordination
  • Multi-week hardware provisioning cycles led to continuous delivery
  • Engineering teams made independent decisions using self-service tools

As a result, it helped Netflix accelerate innovation and stumble upon the DevOps culture. Netflix also gained eight times as many subscribers as it had in 2008. And Netflix’s monthly streaming hours also grew a thousand times from Dec 2007 to Dec 2015.

netflix streaming hours graph

After completing their cloud migration to AWS by 2016, Netflix had:

netflix after cloud migration

And it handled all of the above with 0 Network Ops Centers and some 70 operations engineers, who were all software engineers focusing on writing tools that enabled other software developers to focus on things they were good at.

Migrating to the cloud made Netflix resilient to the kind of outages it faced in 2008. But they wanted to be prepared for any unseen errors that could cause them equivalent or worse damage in the future.

Engineers at Netflix perceived that the best way to avoid failure was to fail constantly. And so they set out to make their cloud infrastructure more safe, secure, and available the DevOps way – by automating failure and continuous testing.

Chaos Monkey

Netflix created Chaos Monkey, a tool to constantly test its ability to survive unexpected outages without impacting the consumers. Chaos Monkey is a script that runs continuously in all Netflix environments, randomly killing production instances and services in the architecture. It helped developers:

  • Identify weaknesses in the system
  • Build automatic recovery mechanisms to deal with the weaknesses
  • Test their code in unexpected failure conditions
  • Build fault-tolerant systems on day to day basis

The Simian Army

After their success with Chaos Monkey, Netflix engineers wanted to test their resilience to all sorts of inevitable failures, detect abnormal conditions. So, they built the Simian Army , a virtual army of tools discussed below.

the simian army netflix

  • Latency Monkey

It creates false delays in the RESTful client-server communication layers, simulating service degradation and checking if the upstream services respond correctly. Moreover, creating very large delays can simulate an entire service downtime without physically bringing it down and testing the ability to survive. The tool was particularly useful to test new services by simulating the failure of dependencies without affecting the rest of the system.

  • Conformity Monkey

It looks for instances that do not adhere to the best practices and shuts them down, giving the service owner a chance to re-launch them properly.

  • Doctor Monkey

It detects unhealthy instances by tapping into health checks running on each instance and also monitors other external health signs (such as CPU load). The unhealthy instances are removed from service and terminated after service owners identify the root cause of the problem.

  • Janitor Monkey

It ensures the cloud environment runs without clutter and waste. It also searches for unused resources and discards them.

  • Security Monkey

An extension of Conformity Monkey, it identifies security violations or vulnerabilities (e.g., improperly configured AWS security groups) and eliminates the offending instances. It also ensures the SSL (Secure Sockets Layer) and DRM (Digital Rights Management) certificates were valid and not due for renewal.

  • 10-18 Monkey

Short for Localization-Internationalization, it identifies configuration and runtime issues in instances serving users in multiple geographic locations with different languages and character sets.

  • Chaos Gorilla

Like Chaos Monkey, the Gorilla simulates an outage of a whole Amazon availability zone to verify if the services automatically re-balance to the functional availability zones without manual intervention or any visible impact on users.

Today, Netflix still uses Chaos Engineering and has a dedicated team for chaos experiments called the Resilience Engineering team (earlier called the Chaos team).

In a way, Simian Army incorporated DevOps principles of automation, quality assurance, and business needs prioritization. As a result, it helped Netflix develop the ability to deal with unexpected failures and minimize their impact on users. 

On 21st April 2011 , AWS experienced a large outage in the US East region, but Netflix’s streaming ran without any interruption. And on 24th December 2012 , AWS faced problems in Elastic Load Balancer(ELB) services, but Netflix didn’t experience an immediate blackout. Netflix’s website was up throughout the outage, supporting most of their services and streaming, although with higher latency on some devices.

Netflix had a cloud-native, microservices-driven VM architecture that was amazingly resilient, CI/CD enabled, and elastically scalable. It was more reliable, with no SPoFs (single points of failure) and small manageable software components. So why did they adopt container technology? The major factors that prompted Netflix’s investment in containers are:

  • Container images used in local development are very similar to those run in production. This end-to-end packaging allows developers to build and test applications easily in production-like environments, reducing development overhead.
  • Container images help build application-specific images easily.
  • Containers are lightweight, allowing building and deploying them faster than VM infrastructure.
  • Containers only have what a single application needs, are smaller and densely packed, which reduces overall infrastructure cost and footprint.
  • Containers improve developer productivity, allowing them to develop, deploy, and innovate faster.

Moreover, Netflix teams had already started using containers and seen tangible benefits. But they faced some challenges such as migrating to containers without refactoring, ensuring seamless connectivity between VMs and containers, and more. As a result, Netflix designed a container management platform called Titus to meet its unique requirements.

Titus provided a scalable and reliable container execution solution to Netflix and seamlessly integrated with AWS. In addition, it enabled easy deployment of containerized batches and service applications.

netflix titus

Titus served as a standard deployment unit and a generic batch job scheduling system. It helped Netflix expand support to growing batch use cases. 

  • Batch users could also put together sophisticated infrastructure quickly and pack larger instances across many workloads efficiently. Batch users could immediately schedule locally developed code for scaled execution on Titus.
  • Beyond batch, service users benefited from Titus with simpler resource management and local test environments consistent with production deployment.
  • Developers could also push new versions of applications faster than before.

Overall, Titus deployments were done in one or two minutes which took tens of minutes earlier. As a result, both batch and service users could experiment locally, test quickly and deploy with greater confidence than before.

“The theme that underlies all these improvements is developer innovation velocity.” 

-Netflix tech blog

This velocity enabled Netflix to deliver fast features to the customers, making containers extremely important for their business.

Netflix invests and experiments significantly in improving development and operations for the engineering teams. But before Netflix adopted the “Operate what you build” model, it had siloed teams. The Ops teams focused on deploy, operate and support parts of the software life cycle. And Developers handed off the code to the ops team for deployment and operation. So each stage in the SDLC was owned by a different person and looked like this:

specialized roles at netflix

The specialized roles created efficiencies within each segment but created inefficiencies across the entire SDLC. The issues that they faced were:

  • Individual silos that slowed down end-to-end progress
  • Added communication overhead, bottlenecks and hampered effectiveness of feedback loops
  • Knowledge transfers between developers and ops/SREs were lossy
  • Higher time-to-detect and time-to-resolve for deployment problems
  • Longer gaps between code complete and deployment, with releases taking weeks

Operate what you build

To deal with the above challenges and drawing inspiration from DevOps principles, Netflix encouraged shared ownership of the full SDLC and broke down silos. The teams developing a system were responsible for operating and supporting it. Each team owned its own deployment issues, performance bugs, alerting gaps, capacity planning, partner support, and so on.

operate what you build at netflix

Moreover, they also introduced centralized tooling to simplify and automate dealing with common development problems of the teams. When additional tooling needs arise, the central team assesses if the needs are common across multiple development teams and built tools. In case of too team-specific problems, the development team decides if their need is important enough to solve on their own.

centralized tooling at netflix

Full Cycle Developers

Combining the above ideas, Netflix built an even better model where dev teams are equipped with amazing productivity tools and are responsible for the entire SDLC, as shown below.

full cycle developers at netflix

Netflix provided ongoing training and support in different forms (e.g., dev boot camps) to help new developers build up these skills. Easy-to-use tools for deployment pipelines also helped the developers, e.g., Spinnaker. It is a Continuous Delivery platform for releasing software changes with high velocity and confidence.

However, such models require a significant shift in the mindsets of teams/developers. To apply this model outside Netflix, you can start with evaluating what you need, count costs, and be mindful of bringing in the least amount of complexities necessary. And then attempt a mindset shift.

Netflix practices are unique to their work environment and needs and might not suit all organizations. But here are a few lessons to learn from their DevOps strategy and apply:

  • Don’t build systems that say no to your developers

Netflix has no push schedules, push windows, or crucibles that developers must go through to push their code into production. Instead, every engineer at Netflix has full access to the production environment. And there are neither strict policies nor procedures that prevent them from accessing the production environment.

  • Focus on giving freedom and responsibility to the engineers

Netflix aims to hire intelligent people and provide them with the freedom to solve problems in their own way that they see as best. So it doesn’t have to create artificial constraints and guardrails to predict what their developers need to do. But instead, hire people who can develop a balance of freedom and responsibility.

  • Don’t think about uptime at all costs

Netflix servers their millions of users with a near-perfect uptime. But it didn’t think about uptime when they started chaos testing their environment to deal with unexpected failure.

  • Prize the velocity of innovation

Netflix wants its engineers to do fun, exciting things and develop new features to delight its customers with reduced time-to-market.

  • Eliminate a lot of processes and procedures

They limit an organization from moving fast. So instead, Netflix focuses on hiring people they can trust and have independent decision-making capabilities.

  • Practice context over control

Netflix doesn’t control and contain too much. What they do focus on is context. Managers at Netflix ensure that their teams have a quality and constant flow of context of the business, rather than controlling them.

  • Don’t do a lot of required standards, but focus on enablement

Teams at Netflix can work with their choice of programming languages, libraries, frameworks, or IDEs as they see best. In addition, they don’t have to go through any research or approval processes to rewrite a portion of the system.

  • Don’t do silos, walls, and fences

Netflix teams know where they fit in the ecosystem, their workings with other teams, dependents, and dependencies. There are no operational fences over which developers can throw the code for production.

  • Adopt “you build it, you run it” culture

Netflix focuses on making ownership easy. So it has the “operate what you build” culture but with the enablement idea that we learned about earlier.

  • Focus on data

Netflix is a data-driven, decision-driven company. It doesn’t do guesses or fall victim to gut instincts and traditional thinking. It invests in algorithms and systems that combs enormous amounts of data quickly and notify when there’s an issue.

  • Always put customer satisfaction first

The end goal of DevOps is to make customer-driven and focus on enhancing the user experience with every release.

  • Don’t do DevOps, but focus on the culture

At Netflix, DevOps emerged as the wonderful result of their healthy culture, thinking and practices.


Get in Touch

Netflix has been a gold standard in the DevOps world for years, but copy-pasting their culture might not work for every organization. DevOps is a mindset that requires molding your processes and organizational structure to continuously improve the software quality and increase your business value. DevOps can be approached through many practices such as automation, continuous integration, delivery, deployment, continuous testing, monitoring, and more.

At Simform, our engineering teams will help you streamline the delivery and deployment pipelines with the right DevOps toolchain and skills. Our DevOps managed services will help accelerate the product life cycle, innovate faster and achieve maximum business efficiency by delivering high-quality software with reduced time-to-market.

' src=

Hiren Dhaduk

Hiren is VP of Technology at Simform with an extensive experience in helping enterprises and startups streamline their business performance through data-driven innovation.

Cancel reply

Your email address will not be published. Required fields are marked *

Your comment here*

Sign up for the free Newsletter

For exclusive strategies not found on the blog

Sign up today!

Related Posts

Kubernetes architecture diagram

Kubernetes Architecture and Components with Diagram

' src=

11 Powerful Docker Alternatives to Revolutionize Containerization in 2024

DevOps CICD and Containerization

DevOps, CI/CD and Containerization: 44 Images Explaining a Winning Trio


Table of Contents

Netflix target audience , what are the key principles of netflix marketing, marketing strategy of netflix, digital marketing strategy of netflix, 5 key takeaways from netflix marketing approach, conclusion , a case study on netflix marketing strategy.

A Case Study on Netflix Marketing Strategy

Netflix was founded in 1997, offering online movie rentals with less than 1000 titles. Soon, it switched to the subscriber-based model, and in 2000 Netflix introduced a personalized movie recommendation system. By 2005 Netflix had over 4.2 million subscribers and started work on a video recommendation algorithm. And finally, in 2007, Netflix began its streaming services and original content creation. By 2016 Netflix had over 50 million subscribers; the story continues today as it is a worldwide presence in the video-on-demand industry. 

Become a Certified Marketing Expert in 8 Months

Become a Certified Marketing Expert in 8 Months

Netflix marketing strategy is undoubtedly a guide for digital marketers worldwide. It is a learning experience to know how this digital media streaming company outperformed all others in the market. 

Netflix's target market is young, tech-savvy users and anyone with digital connectivity. The audience of Netflix is from diverse age groups and demographics. 

However, most of the audience are teenagers, college-goers, entrepreneurs, working professionals, etc. Netflix aggressively works on content expansion and personalization to expand the user base. They separate the kids' and adults' audiences based on their maturity levels. 

Netflix is a fantastic example of an integrating marketing strategy . It is integrated, agile, and customer-driven to make the maximum impact. Netflix follows a customer-centric model to deliver a seamless experience. The platform follows integrated marketing for effective targeting and makes the best use of content marketing for data analytics. 

  • Customer-centricity: Netflix focuses on creating a solid connection with its customers by engaging them personally and personalizing their viewing experience. They also use clever marketing tactics to get people to watch their shows.
  • Integrated viewing experience: Multi-device and up-to-date no matter where you view it from, makes the experience combined.
  • Innovation: Modern marketers must use data analytics to create experiences that delight consumers. Netflix uses customer data analytics to get content recommendations because it knows which movies its customers like to watch. For example, if a Netflix user likes Rocky, it will also offer them sports documentaries. As you manage your business, you, too, need to use data analytics for effective marketing and website optimization.  

Netflix uses data-driven and customer-centric marketing strategies that work in the digital age. Netflix's success relies on constant analysis and optimization, so you can use these tools for marketing your business online.

Netflix's marketing strategy is a surefire example of innovation and modern-day technology growth. The platform has been eager to bring the changes per market need or user demand. The evolution of the marketing tactics from time to time is one of the core reasons behind its success. 

Netflix proves that a brand can connect with customers easily through regular analysis and optimization. Simply put, Netflix's advertising strategy is full of agility, data-collection, user-centricity, personalization, and dedication. Major and minor brands can follow such a strategy and boost brand exposure and market value. 

Let's walk through 5 effective strategies of Netflix's advertising strategy that led them to the most disruptive business model. 

1. Use Personalized Content

Netflix is an excellent example of how personalized content can improve user satisfaction. Netflix knows what TV shows and movies its users like to watch. It uses this information to create customized recommendations for them. This allows them to find the content they enjoy without searching through many lists. It also ensures that users are always getting the latest and greatest content. This level of personalization is critical for online users because it enhances their experience and makes them more likely to return to a site in the future. 

2. Ensure Multi-mode Experience

Starting with a DVD service, Netflix's journey has been successful because of its multi-device strategy. You can open Netflix on TV, computer, smartphone, and tablet with seamless content continuity being watched. The company shows zero restriction in meeting the customers wherever required. Netflix follows both online and offline promotion strategies to boost user engagement. Be it any medium; their marketing strategy remains aligned wherever it can work. 

3. Blend Technology With Marketing Tactic

You wouldn't find two Netflix accounts with the same interface or suggestions. The recommendation shows order is as per user activity and ever-changing. They change the artwork frequently to add a sense of newness. Netflix puts modern-day technology to good use. The platform keeps on having new features to gain maximum engagement. Machine learning is a proven technology trend to transform marketing research to the next level. The blend of ML into advertising is what helps Netflix Marketing Strategy. 

4. Target Emails Like Any Other Marketing Channel

It is wrong to say or consider that email marketing is dead. Netflix is one solid example of a company making the most out of email marketing. They are one step ahead and pairing the email campaigns with machine learning systems. It helps gather more user data and preferences—further, the data segments into multiple user groups for precise and effective customer targeting. So, email marketing can introduce Netflix to new users and show relevant recommendations to the old users. One essential tip from Netflix email marketing is to be creative and take risks. Those old boring emails wouldn't help get such an impact as Netflix today. 

5. Create a Buzz With Better Interactions

Netflix has used the best content marketing strategy in the last decade. The company thinks of an out-of-the-box way to grab quick attention from users. They are bringing standalone products and unmatched experiences. On top of everything, the platform has a seamless communication channel to boost momentary awareness and recognition. The platform allows the audience to be involved in the story and make decisions. This unpredictable move is a proven game-changer for revolutionizing future television. The incomparable buzz in the platform keeps the user stuck to binge-watching. The users feel high engagement in the hopes of finding a happy ending. 

Hence, Netflix happens to be a unique example and inspiration for many fellow companies. They have done a commendable job in content, branding, business model, and product. Netflix marketing strategy has a lot to offer to market enthusiasts and students.

Learn about such integrated marketing strategies with Simplilearn's PGP Digital Marketing Certification Program . You will be taught by Facebook and Purdue University experts, providing a holistic learning experience. Sign-up now and make yourself - job ready! 

Our Digital Marketing Courses Duration And Fees

Digital Marketing Courses typically range from a few weeks to several months, with fees varying based on program and institution.

Recommended Reads

Digital Marketing Career Guide: A Playbook to Becoming a Digital Marketing Specialist

Netflix Recommendations: How Netflix Uses AI, Data Science, and ML

12 Powerful Instagram Marketing Strategies To Follow in 2021

Introductory Digital Marketing Guide

Walmart Marketing Strategy

What is Digital Marketing and How Does It Work?

Get Affiliated Certifications with Live Class programs

Post graduate program in digital marketing.

  • Joint Purdue-Simplilearn Digital Marketer Certificate
  • Become eligible to be part of the Purdue University Alumni Association
  • PMP, PMI, PMBOK, CAPM, PgMP, PfMP, ACP, PBA, RMP, SP, and OPM3 are registered marks of the Project Management Institute, Inc.

Thank you for visiting nature.com. You are using a browser version with limited support for CSS. To obtain the best experience, we recommend you use a more up to date browser (or turn off compatibility mode in Internet Explorer). In the meantime, to ensure continued support, we are displaying the site without styles and JavaScript.

  • View all journals
  • My Account Login
  • Explore content
  • About the journal
  • Publish with us
  • Sign up for alerts
  • Open access
  • Published: 27 February 2024

Construction of knowledge constraints: a case study of 3D structural modeling

  • Xinran Xu 2 &
  • Bingbin Zhang 1  

Scientific Reports volume  14 , Article number:  4704 ( 2024 ) Cite this article

Metrics details

The uncertainty of structural interpretation complicates the practical production and application of data-driven complex geological structure modeling technology. Intelligent structural modeling excavates and extracts structural knowledge from structural interpretation through human–machine collaboration and combines structural interpretation to form a new model of complex structural modeling guided by knowledge. Specifically, we focus on utilizing knowledge rule reasoning technology to extract topological semantic knowledge from interpretive data and employ knowledge inference to derive structural constraint information from complex geological structure models, thus effectively constraining the 3D geological structure modeling process. To achieve this, we develop a rule-based knowledge inference system that derives theoretical models consistent with expert cognition from interpretive data and prior knowledge. Additionally, we represent the extracted knowledge as a topological semantic knowledge graph, which facilitates computer recognition and allows estimation of intersection lines during 3D geological modeling, resulting in the creation of accurate models. The applicability of our proposed method to various complex geological structures is validated through application tests using real-world data. Furthermore, our method effectively supports the realization of intelligent structure modeling in real working area.


Structural modeling is usually not the ultimate goal but supports the numerical and physical simulation of complex phenomena (such as seismic propagation and fluid migration), depth domain imaging, lithology interpretation, and reservoir modeling. The three-dimensional model of the underground structure visually shows the geometric shape and spatial relationship between underground geological interfaces and geological bodies, such as strata and faults 1 , 2 , 3 , Based on structural modeling, sequence and attribute modeling can be performed to directly support reserve calculations, well location deployment, and oil and gas development plan formulation. This is one of the most important tasks in underground resource exploration and development 4 , 5 . In these areas, the quality of seismic data is often poor, and the relationship between key stratum reflection and stratum contact in the seismic profiles is unclear 6 . The transformation of the multistage tectonic movement led to strong deformation of the rock mass, forming a very complex underground structure. Such cases often complicate the acquisition of high-quality structural interpretations, causing considerable uncertainty in the traditional data-driven structural modeling methods 7 , 8 , 9 . The presence of uncertainties can make it challenging to establish a direct link between the geometry of a 3D structural model and the corresponding geological interface 10 , 11 . Because of the high cost of obtaining interpretation data in actual structural modeling, only limited data can be obtained in a certain research area, which requires more expert experience and interpretation to construct a relatively accurate stratigraphic model 1 , 12 , 13 .

Based on the above questions, Zhan et al. constructed the geometric constraints of a structural model through a knowledge graph, which they used to characterize the constraint relationship among knowledge. When the experts failed to comprehend the structural model, a quality assessment was conducted by modifying the knowledge graph to avoid repeated modeling operations 14 . From the graph perspective, knowledge graph is conceptual network and symbolic expression of the physical world. Its nodes represent entities in the real world, and the edges connected by entities represent the semantic links between entities 15 , 16 . Knowledge reasoning often involves overcoming two challenges: the difficulty of obtaining data, which results in sparse and uneven distribution of data samples, and the complexity of spatial relationships between structural elements 17 . These challenges complicate the accurate expression of geological structures using only text data, and the problem of missing structures is sometimes encountered. To address the above issues, this study proposes a process of constructing constraint information of complex geological structure modeling based on knowledge reasoning. The study aims to establish a large-scale structural modeling knowledge base, which is used to fuse the topological position relationship of geological surface space elements and provide technical support for 3D geological structure modeling technology, as shown in Fig.  1 . The construction of a knowledge graph is divided into three stages: (1) conversion of geological data into constraints of a topology knowledge graph, (2) mining of entity and relationship information in geological data through knowledge reasoning, and (3) expert determination of the reliability of the knowledge graph using the wire-frame model.

figure 1

Knowledge constraint construction process.

This study makes the following contributions:

We propose a framework and process to obtain modeling constraint information by knowledge inference of complex geological structure models (Fig.  1 ), transform the knowledge of geological experts into knowledge graph data structures that can be recognized by computers, and represent them into wire frame models that can be recognized by experts.

We construct a common knowledge reasoning rule base in the field of structural modeling and introduce the semantic information of the geological structure into the topological network of the structural model.

We have demonstrated that our approach can effectively deal with real world work and avoid modifying the original data, but change the knowledge in the knowledge base, and improve the accuracy and robustness of the modeling by introducing expert knowledge.

The remainder of this article follows the following organizational structure. “ Materials and methods ” section briefly describes the relevant research methods of this study. In “ Complex geological structure knowledge reasoning ” section, the research methods and processes of constructing constraint knowledge of topological geological modeling through knowledge inference are introduced in detail. “ Result ” section shows the application of the proposed framework to a field case. “ Discussion ” section discusses the feasibility of the proposed framework in comparison with existing approaches. The last section will summarize the work of this study.

Materials and methods

As far as we know, only Zhan et al. 14 have introduced knowledge into the study of structural modeling at present, and there is no relevant study on the constraint information of knowledge inference to construct modeling. In this section we will discuss the following two types of research work that are relevant to this paper: knowledge reasoning and explicit modeling 18 .

Explicit structural modeling

Explicit interactive modeling is a classical 3D modeling method to reconstruct 3D geological structures using sparse data. When modeling complex regions containing various types of geological structures, the limitations of using a single method show up. In display modeling, people introduce the workflow of building closed geological models by introducing multi-source data information to constrain 3D geological models. First, all geological surfaces are reconstructed, and then the intersections between them are found by cutting each other under constraints. Most of the subsequent explicit modeling approaches use this workflow. Display modeling allows a large number of interactive modifications, adding appropriate underground control constraints according to the experience of the modeler, but the amount of interaction is very large and prone to topological contradictions 19 . The square deterministic modeling of structural modeling mainly includes the following methods, namely reservoir seismology methods 20 , reservoir sedimentological methods 21 and kriging interpolation methods 22 . It includes some discrete techniques commonly used in software 23 , 24 , 25 , 26 . Yan et al. shown that spatial explicit models produce better results than non-spatial models, thus showing that space is indeed special in terms of summary 27 . In order to build a more reasonable 3D geological structure model, but now lack of understanding of the specific structure. In display modeling, where it is difficult to choose a way to integrate all types of information, geological histories are used to combine multiple scalar fields, merely showing the geological interfaces cutting each other. We focus on introducing knowledge reasoning into structural modeling to construct structural models, and introducing earth scientists' cognition into display modeling to provide more comprehensive and intuitive structural models. We provide a workflow to construct modeling constraints by knowledge reasoning, and improve the stability and efficiency of display modeling by introducing knowledge reasoning. Figure  2 provides constraint information for the pattern layer.

figure 2

Knowledge reasoning of model constraints

The model constraint can be represented by knowledge graph in data form the construction of the knowledge constraints of the structural model starts from the application field and determines the scope of knowledge constraints 28 . The key to constructing the data layer of the knowledge graph is knowledge reasoning 29 . New insight is obtained through knowledge reasoning, and the given knowledge graph is inferred based on expert knowledge to determine whether it conforms to cognition for updating the rule base of the pattern layer of the knowledge graph.

The target world can be described based on the relationships between entities. Based on this information, the data are not sufficiently meaningful. Relevant data were combined to form the information. Semantics comprise two components: data and relevance. When describing a structural model, the semantic description encompasses both low-level features, such as geometric properties, and high-level features, such as logical relationships between structural elements. By utilizing basic semantic entities, the structural geological model is divided into geometric units, and their spatial topological relationships are described through the relationships between geometric elements. The semantic reasoning between spatial topological geometries yields the construction process of a knowledge graph 30 .

The semantic entities in the modeling knowledge graph refer to geometric objects. Their basic semantic entities can be divided into four types: point (0-cell), line (1-cell), surface (2-cell), and body (3-cell). Relation refers to the topological geometric, positional, and compositional relationships between two entities (including the relationships between target entities). An attribute refers to the position and closure of geometric objects in structural modeling 14 .

Definition 1

( Topological semantic knowledge graph ) The creation and update of topological semantic knowledge graphs can be recorded using meta-knowledge, thereby enabling evolution analysis and traceability of complex geological structural knowledge.

where \(E,V\) represent the basic elements of the topological semantic knowledge graph. Usually, they are expressed in the form of “head entity, relationship between entities, tail entity”. \(GeoMetaK\) represents the meta-knowledge of the topological semantic knowledge graph., It is usually used to indicate the updating of knowledge, \(P_{T}\) and \(P_{L}\) are used to represent the temporal and spatial relationships of topological semantic knowledge graphs, such as the deposition time order of the interpretation data and the topological spatial location association.

Introducing a knowledge-reasoning algorithm into a knowledge graph to constrain its construction and obtain accurate data samples is necessary. In constructing a topological structure knowledge graph, it is very important to determine the relationship between structural elements. Burns et al. introduced a technique for representing the geological topological relations using a network diagram, in which node entities denote spatial elements and edges indicate the topological connections between them 31 . Based on the above ideas, we used the hierarchical network to obtain the computer representation of the topological structure knowledge graph. The topological structure information fully represented the geometric and structural features of the structural geological model.

Complex geological structure knowledge reasoning

Knowledge reasoning rule base construction.

The pattern layer of a knowledge graph consists of defining entities, relationships, and attributes, as well as building the knowledge graph’s rule base. Structural geology has a long history. Numerous rules and a lot of information regarding the genesis and regularity of movement of geological entities have been formed. Some of these rules can be summarized to form knowledge.

Therefore, according to the definition of a knowledge graph in the construction of a complex-structure knowledge graph, and the schema layer and rule base summary of knowledge constraints are shown in Figs.  3 and 4 .

figure 3

Structural geology knowledge ontology graph.

figure 4

Topological rule base of knowledge reasoning, initial predefined geological structure rule base (geological structure diagram), including six common reasoning modes and subgraph reasoning query diagram; the geometric topology rule base (geometric structure diagram) contains common geometric reasoning patterns.

The basic form of the production rule base in this article is

Among them, \(R\) it represents the number of rules, k is the k rule, which represents the obtained rule conclusion, and the extraction of the knowledge graph data layer is realized through logical rules. Where \(E\) and \(C\) represent the conditions and conclusions of rule reasoning. The pattern layer of a knowledge graph consists of defining entities, relationships, and attributes, as well as building the knowledge graph’s rule base. Structural geology has a long history. Numerous rules and a lot of information regarding the genesis and regularity of movement of geological entities have been formed. Some of these rules can be summarized to form knowledge. Therefore, according to the definition of a knowledge graph in the construction of a complex-structure knowledge graph, the rule base of knowledge reasoning is summarized as shown in Fig.  4 .

In the initial rule base, we first added nine separate geological structure rule patterns as the initial state and then used the geological constraint rule base (prior knowledge) to match the common structural patterns. For parts that match, we inferred the interpretation of the corresponding geological structure to be geologically reasonable. In the construction of model knowledge reasoning, the focus was on geometric topology knowledge graphs from the perspective of computer geometry and knowledge graph reasoning. In the new reasoning process, we focused on two possibilities: (1) different structural superposition and superposition methods since exhausting all geological structure rules is difficult, and (2) predefined rule errors (prior knowledge errors), as shown in Fig.  4 . Experts must determine whether new rules need to be added or whether existing rules are suitable. In the process, the inference rule base is gradually improved and optimized.

The topological structure knowledge graph can be used to represent and describe various geological structural models, including fractures, intrusions, and unconformities. By representing these construction patterns as nodes and using edges to describe the relationships between them, a topology knowledge graph can be constructed. In the knowledge graph, the relationship between different nodes can be expressed as a topological relationship, which can more accurately represent the topological relationship in the geological structure and automatically perform topological inspection and constraints during the modeling process to ensure the correctness of the model.

The geological processes and structural types covered in the rule base are extensive, covering common types of geological structures such as faults, folds, and intrusions and geological structures of different scales, such as large-scale topography and small-scale rock structures. We described the universality of the rule base from three perspectives. (1) The first perspective involves the types of covering tectonic patterns; we listed the types of tectonic patterns contained in the rule base, such as faults, intrusions, and unconformity structures, as shown in Fig.  5 . This rule base can cover the most common construction patterns in graphs. (2) The second perspective is the spatial distribution of cover tectonic models. In addition to the types of cover tectonic models, we considered the distribution of these tectonic models in geological space, which can describe various tectonic models in different geological periods and regions. (3) The third perspective includes the complexity of the cover structure model; the complexities of the geological structure models differ. Certain simple models may be relatively easy to describe and identify, whereas others may require more rules to describe. In the subsequent simulation, the given rules were used to achieve a more complex model (Fig.  5 ). Therefore, our rule base can cover and handle construction patterns with different complexities.

figure 5

Lists common geological structure models, such as faults, intrusions, and unconformity structures. Most of the common geological structures exist in fault structures and unconformity structures. In the proposed rules, the above models can be expressed in the form of a knowledge graph.

Knowledge reasoning for constraints of geological structure modeling

Knowledge reasoning, which can accurately connect to structural modeling, is the main technology used to build a knowledge graph in the field of structural modeling. The aim of constructing a knowledge graph is to obtain expert knowledge from seismic interpretation data and structured and semi-structured data from seismic interpretation data, such as coordinate data of the horizon and fault plane, intersection information of the horizon and fault plane, and the extraction of geological entities, spatial relations, semantic relations, and sedimentary relations. Furthermore, we determine whether the number of rules is finite or countable. Finally, a complete knowledge graph of the structural model was obtained by combining the knowledge reasoning of the spatial relations. In structural modeling, knowledge reasoning can combine spatial relations to achieve a more accurate reasoning process. For example, the relationship between geological entities can be inferred based on their relative positions and intersections in space.

Reasoning constraints in the knowledge graph are realized through the rules of knowledge reasoning. The process of knowledge reasoning is as follows. The specific reasoning process is shown in Fig.  6 . The aim of knowledge reasoning is to make human knowledge comprehensible to computers, construct a geometric topology knowledge graph, and guide 3D geological structure modeling. In our input, the effective horizon data is \(H_{i} = \{ h_{{i_{1} }} ,h_{{i_{m} }} ,....,h_{ia} \}\) , where \(h_{{i_{m} }}\) represents the horizon interpretation data at the m point of the \(i\) section, \(M \le a\) , \(a\) is the number of horizon. And the effective fault data is \(F_{j} = \{ f_{{j_{1} }} ,f_{{j_{n} }} ,...,f_{{j_{b} }} \}\) where the \(f_{{j_{n} }}\) represents the fault interpretation data at the nth point of the j section, 1 \(\le {\text{n}}\le {\text{b}}\) , b is the number of fault. the quantity of a and b is determined according to the specific work area.

figure 6

Complete process of knowledge reasoning can be divided into three parts: transforming geological data into geological constraints that can be read by computer; based on the constructed knowledge base, the knowledge graph is constructed via the knowledge reasoning algorithm. The knowledge graph is represented as a wireframe model to allow expert knowledge to participate in the knowledge graph update.

To facilitate processing, we modeled the input prior knowledge as an adjacency matrix. According to the prior knowledge of geological structure (the intersection information of plane and fault plane) obtained from the perspective of geological experts, the prior information is represented by \(G_{1} = (E,V)\) , where \(E = \{ GeoMetaK,E_{e\_g} \}\) ,

where \(e_{ij} = 1\) indicates that the \(i{\text{-}}th\) face and the j th face have an intersection relationship, \(GeoMetaK = \{ E_{F} ,E{}_{H},V\}\) \(E_{F}\) and \(E_{H}\) represent the fault plane and horizon plane entity respectively.

And \((m + n)(m + n)\) represents the number of intersection relations between horizon planes and fault planes calculated by the bounding box method as our meta-knowledge. \(e_{ij} = 0\) indicates that there is no intersection relationship. And the \(E_{e\_g}\) represents the point, line, surface, block semantic entity that interprets the data generated, and its specific form is \(E_{e\_g} = \{ e_{{p_{1} }} ,e_{{p_{2} }} ,...,e_{{p_{n\_p} }} ,e_{{l_{1} }} ,e_{{l_{2} }} ,...,e_{{l_{n\_l} }} ,e_{{f_{1} }} ,e_{{f_{2} }} ,...,e_{{f_{n\_f} }} ,e_{{l_{1} }} ,e_{{b_{2} }} ,...,e_{{b_{n\_b} }} \}\) where \(n\_p\) , \(n\_l\) , \(n\_f\) and \(n\_b\) represent the number of the topological semantic entities of the generated point, line, surface and block entities respectively. What’s more \(V = \{ V_{meet} ,V_{overlap} ,V_{inside} ,V_{disjoin} ,V_{{{\text{cov}} er}} ,V_{compose} ,V_{equal} ,V_{\lim ot} \}\) . The following tables of the above edges represent the topological position relationships between entities respectively, represents the intersection information of prior knowledge (1 represents intersection, and 0 represents non-intersection) to construct the adjacency matrix of prior knowledge. Through the inference of rule base, the transformation of geological structure information to the topological relationship is realized, and the language (i.e., the form of adjacency matrix) that can be read by a computer is c-ombined to create conditions for the subsequent sub-graph inference. The pseudo-code of knowledge reasoning related to algorithm 1 based on the rule base is as Table 1 .

In the first stage of reasoning, prior human knowledge is combined with data that the computer can recognize, and computer cognition is realized by matching the corresponding rule base.

After obtaining the basic topological entities and relationship information of the knowledge graph, pattern matching in the knowledge base is carried out through sub-graph matching research. The planar entities and block entities existing in the knowledge graph are mined by graph isomorphism matching, and the entities and relationship information conforming to the geological structure are obtained by approximate subgraph matching. We studied the problem of sub-graph matching for knowledge graph. Specifically, given a query graph \(G_{q} = \{ E_{query} ,V_{query} \}\) and data graph \(G_{d} = \{ E_{data} ,V_{data} \}\) the sub-graph matching problem refers to obtaining all the data sub-graphs in \(G_{2}\) that is isomorphic to \(G_{3}\) to determine the new entity information in the knowledge graph. Through the study of each algorithm, we selected the VF2 algorithm with block efficiency and speed. The corresponding pseudo-code is as Table 2 .

Generally, mapping M is expressed as the mapping of the node pair \((e_{query} ,e_{data} )\) ( \(e_{query} \in G_{q}\) and \(e_{data} \in G_{d}\) ) each of which represents the mapping of node \(e_{query}\) of \(G_{q}\) and \(e_{data}\) of \(G_{d}\) , \(M \in G_{d} \times G_{q}\) the function \(F(s,e_{query} ,e_{data} )\) is a feasibility function that simultaneously increases the comparison of node and edge labels.

The return value is a Boolean value used to prune the search tree. At the same time, it can elimiate the situation where the two graphs can be isomorphic but the final matching result cannot be obtained, which is used to reduce the number of state spaces. P (s) represents the set of all pairs of nodes to be matched. The VF2 algorithm is used to match the isomorphism graph, and a knowledge graph (face and block entities) that matches the given rule pattern is obtained. The VF2 algorithm cannot directly query the number of isomorphic sub-graphs; however, this function can be realized by modifying the VF2 algorithm. Specifically, based on the VF2 algorithm, each matched node pair can be marked, and unmatched node pairs can then be searched. Each time a new match is found, the marked node pairs are removed from the current search to query the number of isomorphic sub-graphs.

The specific process uses Algorithm 1 to realize rule matching and converts prior human knowledge into structured data on the computer with the help of prior rules to guide subsequent subgraph isomorphism matching. Through the modified sub-graph matching model, all sub-graphs that are isomorphic to the graph are traversed through sub-graph isomorphism to infer all surface entities and block entity knowledge and construct a complete knowledge graph. The quality of a knowledge graph obtained through knowledge reasoning is typically not guaranteed; therefore, it is added to the knowledge base. Before the process of quality evaluation is required, the wire-frame model is used in construction modeling to evaluate the quality of the knowledge graph. As shown in Fig.  7 , when the constructed knowledge graph does not conform to expert cognition, the error information in the knowledge graph is modified and queried using subgraph matching to identify incorrect subgraph information in the knowledge graph. Quality assessment is the process of measuring and evaluating the credibility of new knowledge before it is added to a knowledge base to eliminate errors or conflicts.

figure 7

The whole knowledge reasoning ultimately participates in the workflow of the construction modeling.

Figure  7 shows the workflow of knowledge reasoning construction modeling constraints participating in construction modeling. Firstly, the knowledge graph is represented as a wire-frame model to determine whether there are errors in the knowledge graph and realize the editing of the knowledge graph data. Secondly, the knowledge graph obtained by knowledge reasoning is a data format that can be recognized by the computer to constrain the boundary conditions of the construction model. Finally, the wire-frame model and the final construction model are consistent with expert knowledge.

In the construction process of knowledge reasoning in the knowledge graph, the entities and relationships in Fig.  8 b,c are obtained through the first stage of knowledge reasoning using prior knowledge (Fig.  8 a) (where prior knowledge is used to judge the intersection relationship between the horizon surface and fault surface by interpreting the data). Here, entities refer to the intersection line and intersection point entities obtained from prior knowledge, as well as the topological semantic relationships between entities. Rule matching in the library was used to realize the reasoning of geological knowledge data to topological structure data, and a preliminary knowledge graph was constructed. In the second stage of knowledge reasoning, the hidden entity and relationship information in the knowledge graph are continuously enriched, and all sub-graphs that are isomorphic to the query sub-graph are obtained through the sub-graph isomorphism query. As shown in Fig.  9 , the hidden surface entity and geological block entity knowledge (third and fourth layers) were obtained by reasoning. The knowledge graph obtained in this study does not contain prior knowledge of the initial definition. It is a purely topological semantic knowledge graph. The intersection point, intersection line, closed surface composed of an intersection line, and closed block entity composed of a surface were mined from prior interpretation data. The knowledge graph obtained by knowledge inference is used as the modeling constraint to estimate the boundary conditions of the constructed model[5]and get the final structural modeling,as show in Fig.  10 .

figure 8

The process of point entity and line entity in the topological geometric knowledge graph obtained by rule base matching in the first stage of knowledge reasoning, which is divided into two steps. ( a ) Is prior knowledge, and geological experts input the intersection relationship between horizon surface and fault surface; ( b ) the topological position relationship between the line entity and the entity is obtained by matching the prior knowledge through the rule base; and ( c ) is the point entity obtained by the rule base matching for the intersection of line entities, and the entities with the same attributes are merged through knowledge fusion to complete the first stage of knowledge reasoning task.

figure 9

Second stage of knowledge reasoning, and the face entity and block entity obtained in the previous stage are used to obtain the subgraph isomorphic to the given graph model by subgraph isomorphism matching, and the matching entity and topological position relationship are recorded, to obtain the updated pure topological space knowledge graph.

figure 10

Closed geological body in the actual work area of three-dimensional geological structure under the constraint of knowledge graph.

Practical application of knowledge reasoning

The constructed geometric constraint knowledge graph can be used to constrain 3D geological structure modeling 5 . The specific construction process is shown in Fig.  11 . Knowledge graph technology is used to construct knowledge graphs in the field of geology so that geologists can query and analyze geological data and identify geological laws and evolution trends. According to the geological knowledge in the knowledge graph, the model was constrained and optimized. The constraint relationship in the knowledge graph is used to specify the topological relationship and geometric properties of the intersection line to improve the accuracy and controllability of the intersection line, provide the geometric constraints required for surface reconstruction, and ensure that the surface is as smooth as possible, while satisfying the geological structural characteristics. The optimized geological model was more consistent with the actual geological conditions.

figure 11

Complete process of knowledge reasoning constrained 3D construction modeling. It can be divided into two parts: knowledge reasoning participates in intersection estimation and three-dimensional geological body surface interpolation reconstruction closed three-dimensional geological body.

In the subsequent modeling process, the topological structure knowledge graph is integrated into the boundary feature extraction, and then the boundary features are used as surface topology semantic constraints, and the morphological features are used as surface geometry semantic constraints, combined with multi-source information abstraction such as stratigraphic interpretation point clouds and well layered data. It is a multi-constraint surface regression model. By building a rectangular grid, based on the spatial autoregressive neural network, the fitting error and smoothing error are used as loss functions to solve the model, and then construct a model with high fitting degree, good smoothness and accurate morphological characteristics.

where \((x_{i}^{{hs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{TL} }} )\) and \((x_{i}^{{hs_{BL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{BL} }} )\) represents the coordinate set of the end points of the middle level line of the i level plane in the upper and lower wall of the fault respectively. And \((x_{i}^{{fs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{fs_{TL} }} )\) represents the set of coordinates corresponding to \((x_{i}^{{hs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{TL} }} )\) in the fault line of the fault plane. In the same way, the \((x_{i}^{{fs_{BL} }} ,y_{i}^{{}} ,z_{i}^{{fs_{BL} }} )\) represents the set of coordinates corresponding to \((x_{i}^{{hs_{BL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{BL} }} )\) in the fault line of the fault plane. \(\mu_{i}^{{hs_{TL} }}\) and \(\sigma_{i}^{{hs_{TL} }}\) represent the mean and standard deviation of the dip angle near the end point \((x_{i}^{{hs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{TL} }} )\) of the hanging wall horizon line. \(\mu_{i}^{{hs_{BL} }}\) and \(\sigma_{i}^{{hs_{BL} }}\) represent the mean and standard deviation of the dip angle near the end point \((x_{i}^{{hs_{BL} }} ,y_{i}^{{}} ,z_{i}^{{hs_{BL} }} )\) of the hanging wall horizon line, \(\mu_{i}^{{fs_{TL} }}\) , \(\sigma_{i}^{{fs_{TL} }}\) and \(\mu_{i}^{{fs_{BL} }}\) , \(\sigma_{i}^{{fs_{BL} }}\) represent the mean and standard deviation of the reciprocal fault dip angle near fault points \((x_{i}^{{fs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{fs_{TL} }} )\) and \((x_{i}^{{fs_{TL} }} ,y_{i}^{{}} ,z_{i}^{{fs_{TL} }} )\) .

By comparison in Fig.  12 , it is found that in the traditional extrapolation interpolation method, the intersection lines of the second horizon and the first and third horizons are staggered. At the same time, considering that the fault is a reverse fault, the hanging-footwall intersection line is required to be located above the hanging-footwall intersection line. In the traditional extrapolation interpolation method, the intersecting lines of the upper and lower disks of the first horizon also appear interleaved. Under the guidance of the principle of knowledge graph, the method in this paper can expertly extract reliable boundary feature lines that are consistent with geological laws and expert cognition. Ensure that the intersection line of the deposited lower horizon always remains below the intersection line of the deposited upper horizon. Moreover, the hanging-wall (thrust fault) or hanging-wall (normal fault) intersection line always remains above the hanging-wall (thrust fault) or hanging-wall (normal fault) intersection line. This method effectively prevents the occurrence of intersecting lines of adjacent layers or intersecting lines of upper and lower layers, and eliminates any unreasonable phenomenon.

figure 12

Comparison of boundary feature extraction methods. ( a ) Boundary characteristics guided by knowledge graph; ( b ) boundary characteristics of extrapolation interpolation.

To test the effectiveness of the complex geological knowledge graph based on knowledge reasoning, we constructed a three-dimensional geological model of the study area in Sichuan. When the traditional structural modeling method (Fig.  13 ) encounters an unreasonable fault or horizon, it obtains an accurate geological structural model by modifying the original data (Fig.  14 ). This method requires modification of a large amount of raw data, which is obviously not suitable for more complex work areas. But the method based on the knowledge reasoning (Fig.  10 ) can accurately constrain the three-dimensional geological modeling and reduce the uncertainty of structural modeling, what’s more, this method modifies the knowledge to model the constraints and avoids the uncertainty caused by modifying the original data.

figure 13

Three-dimensional geological structure (error).

figure 14

Three-dimensional geological structure.

Other knowledge reasoning approaches

In addition, certain neural network-based methods, such as the graph neural network (GNN) 32 , 33 , can be used for subgraph isomorphism reasoning. These methods use the learning ability of neural networks to query whether a corresponding subgraph exists by learning the feature vectors of nodes and edges and describing the reasoning problem of the graph in detail.

Owing to the high requirements of domain knowledge graphs, ensuring the accuracy of results using a deep learning method is difficult if applied to practical engineering. The existing method constructs a model rule base by human definition to realize the construction of a knowledge graph; however, following the development of deep learning, the rules existing in the construction of knowledge graphs can be automatically learned, efficient knowledge reasoning can be realized, and the rule base can be avoided. For example, the hidden subface and closed block entities in a knowledge graph can be obtained by reasoning based on a random walk, and a topological semantic knowledge graph can be obtained. The AMIE algorithm proposed by Galárraga et al. is an association rule mining algorithm based on an incomplete knowledge base 34 . Each rule is predicted by learning: for each relationship, starting from the rule whose body is empty, the body part of the rule is expanded by three operations, and the candidate rules whose knowledge degree exceeds the threshold are retained to realize association rule mining research of knowledge graphs.


The method in this article aims to mine the topological structure of existing models from interpretation data. There is no better method for constructing topological semantic knowledge graphs for other data (such as seismic data); the limitation of this method is that it relies on Expert experience realizes the improvement of the knowledge reasoning rule base. For the limited work area data, our geological rule base can only build a knowledge graph and constrain the three-dimensional structural modeling for the existing work area. For other work areas, our method is theoretically applicable. However, for areas that are not accurate enough, we need to continuously add rules for reasoning to further improve our rule base. Based on a large number of experiments, our method is theoretically effective. Scalable. The disadvantage is that it relies on the participation of a large number of expert experience in the early stage, but in the actual work area, the accuracy of this method is relatively high and can meet the needs of actual production.


This study introduces a technique for knowledge reasoning in the field of structural modeling, which can be applied to create modeling constraints for 3D geological structure modeling in the context of oil exploration. The knowledge graph we constructed enables experts and users to access the semantic information contained in the model at any time and maintain expert knowledge throughout the modeling process. Geological experts can visualize the knowledge graph more easily and find connections between knowledge items. The knowledge graph is used to constrain the boundary information of complex structural models, improving the efficiency of intelligent structural modeling.

Data availability

The datasets generated and/or analysed during the current study are not publicly available due [REASON WHY DATA ARE NOT PUBLIC] but are available from the corresponding author on reasonable request.

Perrin, M. & Rainaud, J.-F. Shared earth modeling: knowledge driven solutions for building and managing subsurface 3D geological models. Ed. Technip 283 , 76–112 (2013).

Google Scholar  

Caumon, G., Pellerin, J. & Laurent, G. Current bottlenecks in geomodeling workflows and ways forward. Can. Soc. Petrol. Geol. 37 , 32–64 (2013).

Bentler, P. M. & Chou, C.-P. Practical issues in structural modeling. Sociol. Methods Res. 16 , 78–117 (1987).

Article   Google Scholar  

Guo, J., Wang, X. & Wang, J. Three-dimensional geological modeling and spatial analysis from geotechnical borehole data using an implicit surface and marching tetrahedra algorithm. Eng. Geol. 284 , 106047 (2021).

Zhang, B., Tong, Y. & Du, J. Three-dimensional structural modeling (3D SM) and joint geophysical characterization (JGC) of hydrocarbon reservoir. Minerals 12 , 363 (2022).

Article   ADS   CAS   Google Scholar  

Radwan, A. E. Three-dimensional gas property geological modeling and simulation. In Sustainable Geoscience for Natural Gas Subsurface Systems (ed. Radwan, A. E.) 29–49 (Elsevier, 2022).

Chapter   Google Scholar  

Puzyrev, V., Salles, T. & Surma, G. Geophysical model generation with generative adversarial networks. Geosci. Lett. 9 , 1–9 (2022).

Yu, P., Dempsey, D. & Rinaldi, A. P. Association between injection and microseismicity in geothermal fields with multiple wells: Data-driven modeling of Rotokawa, New Zealand, and Húsmúli, Iceland. J. Geophys. Res. Solid Earth 128 , e2022JB025952 (2023).

Article   ADS   Google Scholar  

Peng, H., Dukalski, M. & Elison, P. Data-driven suppression of short-period multiples from laterally varying thin-layered overburden structures. Geophysics 88 , V59–V73 (2023).

Madsen, R. B., Høyer, A.-S. & Andersen, L. T. Geology-driven modeling: A new probabilistic approach for incorporating uncertain geological interpretations in 3D geological modeling. Eng. Geol. 309 , 106833 (2022).

Hasan, M. & Shang, Y. Geophysical evaluation of geological model uncertainty for infrastructure design and groundwater assessments. Eng. Geol. 299 , 106560 (2022).

Calcagno, P., Chilès, J.-P. & Courrioux, G. Geological modelling from field data and geological knowledge: Part I. Modelling method coupling 3D potential-field interpolation and geological rules. Phys. Earth Planet. Inter. 171 , 147–157 (2008).

Arantes, A. & Ferreira, L. M. D. Development of delay mitigation measures in construction projects: A combined interpretative structural modeling and MICMAC analysis approach. Prod. Plann. Control 1 , 1–16 (2023).

Zhan, X., Lu, C. & Hu, G. 3D structural modeling for seismic exploration based on knowledge graphs. Geophysics 87 , 81–100 (2022).

Steiner, T., Verborgh, R. & Troncy, R. Adding realtime coverage to the google knowledge graph. In 11th International Semantic Web Conference (ISWC 2012) 65–68 (Citeseer, 2012).

Alberti, B. Archaeologies of ontology. Annu. Rev. Anthropol. 45 , 163–179 (2016).

Lee, J. & Zlatanova, S. A 3D data model and topological analyses for emergency response in urban areas. Geospat. Inf. Technol. Emerg. Response 1 , 159–184 (2008).

Gray, J. & Rumpe, B. Explicit versus implicit models: What are good languages for modeling? Softw. Syst. Model. 21 , 839–841 (2022).

Guo, J., Wang, J. & Wu, L. Explicit–implicit-integrated 3-D geological modelling approach: A case study of the Xianyan Demolition Volcano (Fujian, China). Tectonophysics 795 , 228648 (2020).

Bourne, S., Oates, S. J. & Elk, J. F. V. A seismological model for earthquakes induced by fluid extraction from a subsurface reservoir. J. Geophys. Res. Solid Earth 119 , 8991–9015 (2014).

Manzocchi, T., Carter, J. N. & Skorstad, A. Sensitivity of the impact of geological uncertainty on production from faulted and unfaulted shallow-marine oil reservoirs: Objectives and methods. Petrol. Geosci. 14 , 3–15 (2008).

Article   CAS   Google Scholar  

Oliver, M. A. & Webster, R. Kriging: A method of interpolation for geographical information systems. Int. J. Geogr. Inf. Sci. 4 , 313–332 (1990).

Ming, J., Pan, M. & Qu, H. GSIS: A 3D geological multi-body modeling system from netty cross-sections with topology. Comput. Geosci. 36 , 756–767 (2010).

Guo, J., Wu, L. & Zhou, W. Section-constrained local geological interface dynamic updating method based on the HRBF surface. J. Struct. Geol. 107 , 64–72 (2018).

González-Garcia, J. & Jessell, M. A 3D geological model for the Ruiz-Tolima Volcanic Massif (Colombia): Assessment of geological uncertainty using a stochastic approach based on Bézier curve design. Tectonophysics 687 , 139–157 (2016).

Mallet, J. L. Discrete smooth interpolation in geometric modelling. Comput. Aided Des. 24 , 178–191 (1992).

Yan, B., Janowicz, K. & Mai, G. A spatially explicit reinforcement learning model for geographic knowledge graph summarization. Trans. GIS 23 , 620–640 (2019).

Hogan, A., Blomqvist, E. & Cochez, M. Knowledge graphs. ACM Comput. Surv. 54 , 1–37 (2021).

Hao, X., Ji, Z. & Li, X. Construction and application of a knowledge graph. Remote Sens. 13 , 2511 (2021).

Zhan, X., Lu, C. & Hu, G. Event sequence interpretation of structural geological models: A knowledge-based approach. Earth Sci. Inform. 14 , 99–118 (2021).

Burns, K. L. Retrieval of Tectonic Process Models from Geologic Maps and Diagrams (1981).

Krleža, D. & Fertalj, K. Graph matching using hierarchical fuzzy graph neural networks. IEEE Trans. Fuzzy Syst. 25 , 892–904 (2016).

Liu, X., Pan, H. & He, M. Neural subgraph isomorphism counting. In Proc. 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining 1959–1969 (2020).

Galárraga, L. A., Teflioudi, C. & Hose, K. AMIE: Association rule mining under incomplete evidence in ontological knowledge bases. In Proc. 22nd International Conference on World Wide Web 413–422 (2013).

Download references


In this section, you can acknowledge any support given which is not covered by the author contribution or funding sections. This may include administrative and technical support, or donations in kind (e.g., materials used for experiments).

This research was funded by National Natural Science Foundation of China, Grant Number 41974147.

Author information

Authors and affiliations.

School of Information and Communication Engineering, University of Electronic and Science Technology of China, Chengdu, China

Cai Lu & Bingbin Zhang

School of Resources and Environment, University of Electronic and Science Technology of China, Chengdu, China

You can also search for this author in PubMed   Google Scholar


The following statements should be used “Conceptualization, X.X. and C.L.; methodology, X.X.; software, C.L.; validation, C.L., X.X. and B.Z.; formal analysis, X.X.; investigation, C.L.; resources, X.X.; data curation, X.X.; writing—original draft preparation, C.L.; writing—review and editing, C.L.; visualization, C.L.; supervision, C.L.; project administration, C.L.; funding acquisition, X.X. All authors have read and agreed to the published version of the manuscript”.

Corresponding author

Correspondence to Xinran Xu .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

Lu, C., Xu, X. & Zhang, B. Construction of knowledge constraints: a case study of 3D structural modeling. Sci Rep 14 , 4704 (2024). https://doi.org/10.1038/s41598-024-55115-4

Download citation

Received : 18 December 2023

Accepted : 20 February 2024

Published : 27 February 2024

DOI : https://doi.org/10.1038/s41598-024-55115-4

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Knowledge reasoning
  • Interpretation
  • Complex geological structures

By submitting a comment you agree to abide by our Terms and Community Guidelines . If you find something abusive or that does not comply with our terms or guidelines please flag it as inappropriate.

Quick links

  • Explore articles by subject
  • Guide to authors
  • Editorial policies

Sign up for the Nature Briefing newsletter — what matters in science, free to your inbox daily.

netflix technology case study

  • Newsletters

The weird way Alabama’s embryo ruling takes on artificial wombs

A state supreme court has shocked fertility clinics by ruling that lab embryos are children.

  • Antonio Regalado archive page

Specialist Embryologist Takes a capsule With Embryos from the Cryobank

This article first appeared in The Checkup, MIT Technology Review ’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here .

A ruling by the Alabama Supreme Court last week that frozen embryos stored in labs count as children is sending “ shock waves ” through the fertility industry and stoking fears that in vitro fertilization is getting swept up into the abortion debate.

The New York Times reports that one clinic, at the University of Alabama, has stopped fertilizing eggs in its laboratory, fearing potential criminal prosecution .

Fertility centers create millions of embryos a year. Some are frozen and others used in research, but most are intended to be transplanted into patients’ wombs so they can get pregnant. 

The Alabama legal ruling is clearly animated by religion—there are lots of Bible quotes and references to “murder” when discussing abortion. But what hasn’t gotten as much notice is the court’s specific argument that an embryo is a child “regardless of its location.” This could have implications for future technologies in development, such as artificial wombs or synthetic embryos made from stem cells. 

The case arose from an incident at an Alabama IVF clinic, the Center for Reproductive Medicine, in which a patient wandered into a storage area and removed a container of embryos from liquid nitrogen. 

That’s when “the subzero temperatures at which the embryos had been stored freeze-burned the patient’s hand, causing the patient to drop the embryos on the floor,” the decision recounts. The embryos, consisting of just a few cells, thawed out and died.

Angered by the mishap, some families then tried to collect financial damages. They sued under Alabama’s Wrongful Death of a Minor statute, which was first written in 1872, long before test-tube babies.

The question the court felt it had to decide: Do frozen embryos count as minor children or not? 

The defendants argued, in part, that an IVF embryo can’t be a child or a person because it’s not yet in a biological womb. No womb, no baby, no birth, and no child. And this is where things start to get interesting and spiral into science fiction territory. 

Justice Jay Mitchell, writing for the majority, pounced on what he called the “latent implication” of the defense’s argument. What about a baby growing to term an artificial womb? Would it also not count as a person, he asked, just because it’s not “in utero”?

According to their ruling , the wrongful-death act “applies to all unborn children, regardless of their location,” and “no exception” can be made for embryos regardless of their age, even if they’ve been in deep freeze for a decade. Nor does the law exclude any type of “extrauterine children” science can conceive.

It’s common for judges to wrestle with complex questions as they try to apply old laws to new technology. But what’s so unusual about this decision is that the judges ended up giving opinions on technology that hasn’t been fully invented.

“I think the opinion is really extraordinary,” says Susan Wolf, a professor of law and medicine at the University of Minnesota. “I can’t think of another case where a court powered its ruling by looking not only at technology not actually before the court, but number two, that doesn’t exist in human beings. They can’t make a binding decision about future technology that is not even part of the case.” 

Bad law or not, the question the Alabama justices ruled on could soon be a real one. Several companies are actually developing artificial wombs to keep very premature infants alive, and other research labs are working with fluid-filled bottles in which they’ve grown mouse embryos until they are fetuses with beating hearts. 

One startup company in Israel, Renewal Bio, says it wants to grow synthetic human embryos (the kind formed by stem cells) until they are 40 days old, or more, in order to collect their tissue for transplant medicine. 

All this technology is racing along, so the question of the moral and legal rights of incubated human fetuses might not be hypothetical for very long. 

Among the dilemmas lawyers and doctors could face: If a fetus is growing in a tank, would a decision to shut off its support systems be protected under liberal states’ abortion laws, which are typically based on the rights of a pregnant person? Would a fetus engineered solely to grow organs, lacking a brain cortex and without sentience, also still be considered a child in Alabama?

So while it’s obvious that the Alabama decision reflects the justices’ religious views rather than science, and that it could hurt people who just want to have a baby, maybe it is time to think about what the court calls the “many difficult questions” the wrongful-death case has raised about “the ethical status of extrauterine children.”

Now read the rest of The Checkup

For the first time, you can easily order GMOs to plant at home . The biotech plants on sale include a bright-purple tomato and a petunia plant that glows in the dark. ( MIT Technology Review )

From MIT Technology Review ’s archives

Last fall, my colleague Cassandra Willyard told us everything we need to know about artificial wombs . The experimental devices, she explained, are being developed to give premature babies more time to develop. So far, they’ve been tested on lambs, but human studies are being planned.

Another kind of artificial womb is used to keep very early embryos developing longer in the lab. A startup based in Israel called Renewal Bio says it hopes to grow “synthetic” human embryos this way longer than ever before as a way of bio-printing organs. 

After the US Supreme Court overturned abortion protections in 2022, several American states moved to ban the practice. Anticipating that people may seek abortions anyway, we explained how to end a pregnancy with pills ordered from an online pharmacy. 

Around the web

Elon Musk announced on X that the first volunteer to receive a brain implant from his company Neuralink can control a computer with it and can “move a mouse around the screen just by thinking.” Some commentators are annoyed at Musk for grabbing publicity while revealing few details about the study. ( Wired ) 

Biotechnology and health

Scientists are finding signals of long covid in blood. they could lead to new treatments..

Faults in a certain part of the immune system might be at the root of some long covid cases, new research suggests.

  • Cassandra Willyard archive page

This baby with a head camera helped teach an AI how kids learn language

A neural network trained on the experiences of a single young child managed to learn one of the core components of language: how to match words to the objects they represent.

The first gene-editing treatment: 10 Breakthrough Technologies 2024

Sickle-cell disease is the first illness to be beaten by CRISPR, but the new treatment comes with an expected price tag of $2 to $3 million.

We’ve never understood how hunger works. That might be about to change.

Scientists have spent decades trying to unravel the intricate mysteries of the human appetite. Are they on the verge of finally determining how this basic drive functions?

  • Adam Piore archive page

Stay connected

Get the latest updates from mit technology review.

Discover special offers, top stories, upcoming events, and more.

Thank you for submitting your email!

It looks like something went wrong.

We’re having trouble saving your preferences. Try refreshing this page and updating them one more time. If you continue to get this message, reach out to us at [email protected] with a list of newsletters you’d like to receive.


Science and Technology Studies

STS offers degrees at each university level: undergraduate, masters, and PhD

  • Warning Research Centre


Greater Manchester: A Local Case Study in Building City Resilience

06 March 2024, 2:00 pm–3:30 pm

Headshot of Dr Kathy Oldham OBE

Dr Kathy Oldham OBE (Chief Resilience Officer, Greater Manchester, UK) joins the WRC for a guest talk on building city resilience

This event is free.

Event Information


Urban resilience is arguably a journey rather than a destination and the Greater Manchester city-region has been taking action to strengthen its resilience for over a decade. In her presentation Kathy will describe some of the pivotal moments in Greater Manchester’s work, including forging partnerships within the city-region, embedding resilience across different policy agendas, and collaborating with cities across the world. She will also reference some of the key emergencies that have shaped Greater Manchester’s approaches to resilience including flooding of 9 of its 10 Boroughs on Boxing Day 2015, the impact of winter storms trapping the equivalent of a small town on the highest motorway in England, and the Manchester Arena attack. Greater Manchester has been selected as one of only 8 areas in the country to work with Government on new ways to strengthen urban resilience and Kathy will conclude by describing the plans for the next chapter in the city region’s resilience journey.

Hybrid Event Information:

The event will take place on UCL's campus, however the event will be hybrid, so do select an online-only ticket at checkout to join us, wherever you are in the world! Zoom links for registered attendees will be sent in good time prior to the event to the email address used when booking an online ticket.

About the Speaker

Dr kathy oldham obe.

Chief Resilience Officer at Greater Manchester, UK

Kathy heads up a specialist unit within the Greater Manchester Combined Authority delivering disaster risk reduction and emergency response services for Greater Manchester’s ten local authorities. She has led the development of Greater Manchester’s Resilience Strategy and provides a strategic advice function to the multi-agency Greater Manchester Resilience Forum partnership. Kathy recently led a successful bid for Greater Manchester to become a pilot in the national Stronger LRFs Programme designed to test ways of implementing the UK Government Resilience Framework.  She also leads Greater Manchester’s participation in the international Resilient Cities Network, UNDRR’s Making Cities Resilient 2030 (MCR2030) initiative, within which Greater Manchester has been recognised as a Resilience Hub, and the Counter Terrorism Preparedness Network (CTPN). She has been appointed as a Commissioner for the National Preparedness Commission. With over 15 years’ experience in resilience, Kathryn has been engaged in drafting national, British, European and International Standards on resilience. Kathy holds a medical degree and has previously held a wide range of positions in local government.


  1. How Netflix Became A Master of DevOps? An Exclusive Case Study

    netflix technology case study

  2. Conception du système Netflix

    netflix technology case study

  3. Netflix Case Study (Concept) on Behance

    netflix technology case study

  4. Netflix Case Study

    netflix technology case study

  5. Use of Analytics by Netflix

    netflix technology case study

  6. Netflix case study

    netflix technology case study


  1. PDF Strategic Innovation Management at Netflix: A Case Study

    Strategic Innovation Management at Netflix: A Case Study Ingrid Souza1 and Fernando Romero1,2 1Department of Production and Systems, University of Minho, Guimarães, Portugal ... AND ("innovation" OR "netflix" OR "technology business")] and were searched among the Title, Abstract and Keywords for the time span from 2011 to 2021.

  2. How Netflix Faced A Digital Transformation: A Case Study

    Netflix serves as the ultimate digital transformation case study. They transformed their entire business model and charted unprecedented waters. Here's how to use their model as inspiration for your contact center's digital transformation. How to move your operations to the cloud, Netflix style: A digital transformation case study.

  3. AWS Innovator: Netflix

    Read the case study Global Production ft. Netflix 2021 In this keynote presentation at SIGGRAPH 2021, Laura Teclamariam, director of product and animation at Netflix, and Rahul Dani, director of studio engineering at Netflix, discuss how content production today is truly global and what that means for storytellers. Watch the video

  4. Case Study: How Netflix uses Cloud for Innovation, Agility and

    (Source: Netflix, Amazon Case Study on Netflix). The existing IaaS delivery is executed using the consoles of cloud providers, allowing a faster release of new features for users. ... Cloud computing is a technology designed to help organizations obtain the first-mover advantage, as evident from their rich variety of service offerings.

  5. Netflix's Bold Disruptive Innovation

    Adam Richardson September 20, 2011 Every now and then, the business world presents us with a lab experiment that we can observe in realtime. Netflix's announcement that it is splitting off its...

  6. PDF Disruptive Innovation: a Case Study on How Netflix Is ...

    The research utilizes a case study approach, using Netflix as the case company. Due to technological advancements the TV and entertainment content industry has drastically changed with new methods of consuming content, and new business models to disrupt the market. Having disrupted the market, Netflix remains a leading force among consumers.

  7. Netflix's Growth Alongside Digital Transformation

    This success has come from primarily three reasons: 1) advancements in streaming capabilities accelerated in line or better than expected as Netflix transitioned to a primarily streaming service; 2) the proliferation of mobile phones and tablets as well as the introduction of smart televisions allowed Netflix to be available to its customers at ...

  8. Netflix & Big Data: The Strategic Ambivalence of an Entertainment

    Netflix actively fueled what is known as the myth of big data, promoting their recommender system and data-driven production as cutting-edge, all-seeing, and all-knowing. Today, however, the company is increasingly acknowledging the role of human expertise and creativity. In this paper I explore the strategic repositioning of Netflix from ...

  9. Artificial Intelligence at Netflix

    Machine learning Recommendations Experimentation and causal inference Analytics Encoding and quality Computer vision In this article, we'll look at how Netflix has explored AI applications for its business and industry through two unique use-cases:

  10. The Power of Digitalization: The Netflix Story

    First Online: 18 May 2020 4254 Accesses 2 Citations 8 Altmetric Part of the Advances in Intelligent Systems and Computing book series (AISC,volume 1161) Abstract The evolution of technology, and mainly the evolution of the Internet, has improved the way business is done.

  11. Netflix Recommender System

    From (Netflix Technology Blog, 2017c), offline computation is applied to data and it is not concerned with real-time analytics at the user. Execution time is relaxed, and the algorithm is trained in batches without any pressure on the amount of data to be processed in a fixed time interval. ... Case Study: How Netflix Uses AI to Personalize ...

  12. Netflix Case Study

    Netflix Case Study. Online content provider Netflix can support seamless global service by using Amazon Web Services (AWS). AWS enables Netflix to quickly deploy thousands of servers and terabytes of storage within minutes. Users can stream Netflix shows and movies from anywhere in the world, including on the web, on tablets, or on mobile ...

  13. Disruptive Innovation: A Case Study on How Netflix is Transforming the

    The research utilizes a case study approach, using Netflix as the case company. Due to technological advancements the TV and entertainment content industry has drastically changed with new methods of consuming content, and new business models to disrupt the market. Having disrupted the market, Netflix remains a leading force among consumers. ...

  14. Netflix: A Case of Transformation for Video Streaming Service

    2. Hybrid App Netflix is a hybrid app, which gives it an edge over native apps. A hybrid app comes with features like easy code portability, low maintenance cost, and fast development speed. As the technology related to hybrid apps keeps on evolving, the streaming service can update its functionalities seamlessly.

  15. Elevating Entertainment: A Case Study on Netflix's User Experience

    By seamlessly blending technology and entertainment, Netflix has set a benchmark for the industry, illustrating how a user-centric approach can redefine the way we engage with digital platforms. As the streaming landscape continues to evolve, Netflix's user experience remains a case study in effective design and customer satisfaction.

  16. How Netflix Became A Master of DevOps? An Exclusive Case Study

    This case study explores how Netflix implemented DevOps by drawing inspiration from its principles and focusing on a collaborative culture that prizes innovation. Even though Netflix is an entertainment company, it has left many top tech companies behind in terms of tech innovation. With its single video-streaming application, Netflix has ...

  17. Netflix Case Study: Unveiling Data-Driven Strategies for Streaming

    Netflix Case Study (EDA): Unveiling Data-Driven Strategies for Streaming Swapnil Vishwakarma 14 Jun, 2023 • 16 min read Introduction Welcome to our comprehensive data analysis blog that delves deep into the world of Netflix. As one of the leading streaming platforms globally, Netflix has revolutionized how we consume entertainment.

  18. Netflix: A Case Study Customer Focus, Innovation, Global ...

    Published Oct 11, 2023 + Follow Netflix's success as a global streaming giant with over 220 million subscribers across 190 countries offers valuable lessons and actionable insights for CxOs....

  19. An Analysis of Netflix's Business Strategy and How the Company is

    Abstract Netflix, the pioneer of streaming service, is noted for its game-changing strategies that has not only set the foundation of Over-the-Top (OTT) services but also introduced the major...

  20. Netflix

    9 min read · Oct 2, 2023 Photo by freestocks on Unsplash The arrival of the digital age sparked a new era in the entertainment sector. The conventional paradigms of information delivery and...

  21. Case Study: How Netflix Uses AI to Personalize Content ...

    Here are a few tips for businesses from the Netflix case study: Collect data on your customers' actions and preferences. You can use this data to tailor your marketing messages, suggest products ...

  22. Netflix Case Study: Analysis

    But in 2016, Netflix made a come back and accumulated $8.83 billion in revenue. One would wonder how come a company which was worth $50 million in 2000 is now worth around $87 billion. A layman ...

  23. A Case Study on Netflix Marketing Strategy

    A Case Study on Netflix Marketing Strategy By Rahul Arun Last updated on Oct 30, 2023 107938 Netflix was founded in 1997, offering online movie rentals with less than 1000 titles. Soon, it switched to the subscriber-based model, and in 2000 Netflix introduced a personalized movie recommendation system.

  24. Research on the Manufacturing Process and Restoration of a Brigandine

    With the Ming brigandine in Hangzhou Arts and Crafts Museum as an example, the author explored the manufacturing process of the Ming brigandine and conducted a restoration study of it, from which the superiority of the brigandine's manufacturing process and the scale production of Ming armour were summarised and discussed.

  25. Construction of knowledge constraints: a case study of 3D ...

    This study makes the following contributions: (1) We propose a framework and process to obtain modeling constraint information by knowledge inference of complex geological structure models (Fig. 1 ...

  26. The weird way Alabama's embryo ruling takes on artificial wombs

    The case arose from an incident at an Alabama IVF clinic, the Center for Reproductive Medicine, in which a patient wandered into a storage area and removed a container of embryos from liquid ...

  27. Greater Manchester: A Local Case Study in Building City Resilience

    Science and Technology Studies; ... Greater Manchester: A Local Case Study in Building City Resilience. 06 March 2024, 2:00 pm-3:30 pm . Dr Kathy Oldham OBE (Chief Resilience Officer, Greater Manchester, UK) joins the WRC for a guest talk on building city resilience.