The document discusses various use cases for Azure Cosmos DB including handling peak sales periods with elastic scaling, delivering real-time recommendations, leveraging IoT telemetry to build experiences, delivering high-quality app experiences globally at scale, and modernizing and building new apps with real-time personalization. It provides examples of companies like Walmart Labs, ASOS, and The Walking Dead game using Cosmos DB for these scenarios. The document also discusses migrating NoSQL workloads from databases like MongoDB, Cassandra, and DynamoDB to Azure Cosmos DB and provides an example of Symantec migrating Cassandra workloads.
From Insights to Action: How to Build and Maintain a Data-Driven Organization (Amazon Web Services Korea)
Data is the foundation of innovation and transformation. The innovation that drives business transformation is not a point-in-time strategy or solution but an iterative, collective plan for growth. Companies that adopt this approach to innovation often ground their strategy and business culture in data. Developing this approach requires leaders to treat data as an organizational asset and to empower the organization to use data for better business outcomes. Learn how AWS and Amazon have used data and analytics to create scalable business efficiencies and to develop mechanisms that solve customers' most complex problems.
The document discusses GE's Industrial Data Lake Platform. It notes that industrial data is growing rapidly in terms of both volume and variety. However, most industrial data is not analyzed due to challenges in gathering, preparing, and analyzing the data. GE's Industrial Data Lake is presented as a solution to address these challenges. It provides a single place to access both real-time and historical industrial data of all types. It also allows for more flexible and agile data models compared to traditional data warehouses. The data lake is optimized for industrial workloads and includes features like fast data ingestion, high performance analytics, and data governance capabilities.
What is Microsoft Azure?
What is Azure used for?
Why do businesses want to use someone else's hardware?
What are the advantages of virtualization?
Is Azure secure?
How does Azure stack up against the competition?
This session answers these questions to help you make an informed decision about whether Azure is right for your business.
This one-hour presentation covers the tools and techniques for migrating SQL Server databases and data to Azure SQL Database or SQL Server on an Azure VM, including SSMA, DMA, DMS, and more.
This document provides an overview of Microsoft Azure including what Azure is, the platform services it offers, licensing and purchasing options, estimating costs, and resources for getting started with Azure. Azure is an on-demand cloud computing platform that provides infrastructure and platform services. It offers computing, networking, databases, analytics, mobile, IoT and enterprise application services. Customers can purchase Azure services through pay-as-you-go, commitment plans, or open licensing programs. The document recommends starting points for learning Azure and provides additional resources.
The document provides an overview of the Databricks platform, which offers a unified environment for data engineering, analytics, and AI. It describes how Databricks addresses the complexity of managing data across siloed systems by providing a single "data lakehouse" platform where all data and analytics workloads can be run. Key features highlighted include Delta Lake for ACID transactions on data lakes, auto loader for streaming data ingestion, notebooks for interactive coding, and governance tools to securely share and catalog data and models.
The Azure Migration Program provides a step-by-step approach to migrate workloads to Azure over time. It offers prescriptive guidance, tools, skill building, and incentives to accelerate customers' journey to the cloud. Customers first assess their environments and plan migrations. They then build the foundation and complete skill building. With assistance from Microsoft and partners, customers execute migrations, optimize workloads, and establish management and security practices on Azure.
The gamer world and the market-entry opportunity through a movement-based approach (Accenture)
The document discusses the relationship between video games and physical activity, analyzing the games market, the historical evolution of motion-based play in games, and the profiles of Brazilian gamers. It presents mappings of stakeholders in the games market and their relationship with players, along with insights into exercise habits and brand preferences among gamers.
This document discusses new analysis skills required for working with big data technologies like NoSQL databases. It provides examples of popular open source NoSQL databases like Cassandra, HBase, MongoDB and Couchbase. It also classifies NoSQL databases into categories like column-oriented, document, key-value and graph databases. The document then discusses how big data solves problems related to volume, velocity, variety and value of data. It provides examples of sources of big data and trends shaping interest in big data. Finally, it discusses use cases of big data in retail and finance industries.
Prudential has invested millions of dollars in a digital innovation program introducing new business models: Hybrid Digital Agency, Alternative Distribution Channels, and a Wellness Ecosystem. The goals are to grow sales, agent productivity, and customer Net Promoter Score. Prudential is executing on these models across Latin America using agile methodology, mobile platforms, and a modular architecture. Key aspects of the models include end-to-end digital sales, a wellness platform incentivizing healthy behaviors, and partnerships to expand distribution. Early results show increases in engagement, goal completion, and positive customer feedback.
The document discusses the cloud security provided by AWS. It highlights that AWS has built its infrastructure to high security standards and complies with a range of security and compliance standards. It also discusses how companies can implement hybrid architectures, with security shared between their own infrastructure and AWS.
Best Practices in DataOps: How to Create Agile, Automated Data Pipelines (Eric Kavanagh)
Synthesis Webcast with Eric Kavanagh and Tamr
DataOps is an emerging set of practices, processes, and technologies for building and automating data pipelines to meet business needs quickly. As these pipelines become more complex and development teams grow in size, organizations need better collaboration and development processes to govern the flow of data and code from one step of the data lifecycle to the next – from data ingestion and transformation to analysis and reporting.
DataOps is not something that can be implemented all at once or in a short period of time. DataOps is a journey that requires a cultural shift. DataOps teams continuously search for new ways to cut waste, streamline steps, automate processes, increase output, and get it right the first time. The goal is to increase agility and cycle times, while reducing data defects, giving developers and business users greater confidence in data analytic output.
This webcast examines how organizations adopt DataOps practices in the field. It will review results of an Eckerson Group survey that sheds light on the rate and scope of DataOps adoption. It will also describe case studies of organizations that have successfully implemented DataOps practices, the challenges they have encountered and benefits they’ve received.
Tune into our webcast to learn:
- User perceptions of DataOps
- The rate of DataOps adoption by industry and other demographic variables
- DataOps adoption by technique and component (i.e., agile, test automation, orchestration, continuous development/continuous integration)
- Key challenges organizations face with DataOps
- Key benefits organizations experience with DataOps
- Best practices in doing DataOps
- Case studies and anecdotes of DataOps at companies
The document discusses Azure Data Factory and its capabilities for cloud-first data integration and transformation. ADF allows orchestrating data movement and transforming data at scale across hybrid and multi-cloud environments using a visual, code-free interface. It provides serverless scalability without infrastructure to manage along with capabilities for lifting and running SQL Server Integration Services packages in Azure.
Microsoft Azure - Introduction to Microsoft's Public Cloud (Atanas Gergiminov)
This document provides an overview of Microsoft Azure, Microsoft's public cloud platform. It discusses Azure's infrastructure as a service (IaaS) and platform as a service (PaaS) offerings, as well as other services like compute, storage, networking, databases, web apps, and identity and access management. Usage statistics show that Azure trails only Amazon Web Services (AWS) in market share of public cloud providers. The document outlines how to sign up for a free Azure trial account and lists additional Microsoft resources for learning about Azure.
Accenture is undergoing a digital transformation to improve services for clients, employees, and the business. This involves streamlining processes, automating tasks, and using data analytics across the organization. The transformation includes developing integrated digital business services using tools like SAP to improve account management, sales, delivery, and other operations. It aims to provide employees with better tools and data to serve clients more efficiently. The multi-year change process focused on practical technology solutions and ensuring employees adopt new digital ways of working.
ETL Made Easy with Azure Data Factory and Azure Databricks (Databricks)
This document summarizes Mark Kromer's presentation on using Azure Data Factory and Azure Databricks for ETL. It discusses using ADF for nightly data loads, slowly changing dimensions, and loading star schemas into data warehouses. It also covers using ADF for data science scenarios with data lakes. The presentation describes ADF mapping data flows for code-free data transformations at scale in the cloud without needing expertise in Spark, Scala, Python or Java. It highlights how mapping data flows allow users to focus on business logic and data transformations through an expression language and provides debugging and monitoring of data flows.
Artificial intelligence has the potential to modernize and streamline the insurance industry by enhancing automation, reducing costs, lowering risks, and facilitating faster decision-making. Key reasons for the expected growth of AI use within insurance is the large amount of data available to train systems. While AI can benefit insurance through improved customer experiences, pricing, and claims processing, challenges to adoption include high costs, reliability issues, and increasing regulatory concerns around privacy and automated decision-making.
Technology Vision 2022: Communications Industry (Accenture)
Accenture's Technology Vision 2022 for the Communications industry details the key building blocks of the Metaverse Continuum that every CSP needs to know. accntu.re/3l8fmT8
AI in Insurance: How to Automate Insurance Claim Processing with Machine Lear... (Skyl.ai)
About the webinar
Insurance companies are looking to technology to manage the complexity created by cumbersome processes and by the many parties involved in the claim processing cycle, such as actuaries, support teams, and customers.
Today, many insurance companies are adopting machine learning to simplify and automate these processes: reducing fraudulent claims, predicting underwriting risks, and improving customer relationship management. An automated claim process can remove excessive human intervention and manual errors, and can report the claim, capture damage, update the system, and communicate with customers on its own. The result is an effortless process that lets clients file their claims without much hassle.
In this webinar, we will discuss how insurers are increasingly relying on machine learning to improve claim processing efficiency and increase ROI.
What you'll learn
- How insurance companies are using ML to drive more efficiency and business gain
- Best practices to automate machine learning models
- Demo: A deeper understanding of the end-to-end machine learning workflow for car damage recognition using Skyl.ai
The document is a presentation deck for Microsoft sellers to introduce Azure Cosmos DB to customers. It covers the challenges faced by modern app developers, how Cosmos DB addresses those challenges through its globally distributed database capabilities, and provides examples of customer use cases across different industries. The deck also highlights key features of Cosmos DB such as elastic scaling, multiple data models/APIs, security/compliance, and performance guarantees through service level agreements.
This document discusses the challenges of modern apps and how Microsoft's Azure cloud services provide solutions. It focuses on Azure Cosmos DB, a globally distributed database service that can scale massive amounts of data across any workload. Cosmos DB provides elastic scaling, guaranteed low latency, comprehensive security and compliance, and helps companies optimize operations and gain insights from IoT and big data.
Tour de France Azure PaaS 3/7: Storing information (Alex Danvy)
Three options for storing data in Azure:
- Evolution: the storage account is more essential than ever. Basic as it is, it keeps evolving.
- Innovation: the cloud makes it possible to imagine new scenarios that put storage technologies to a hard test, and sometimes new ones have to be invented: Cosmos DB.
- Open source: while open-source solutions can be run in VMs, that rarely adds much value, so their management is best left to the cloud provider. MySQL, PostgreSQL, and MariaDB are now available as managed services.
DocumentDB is a fast, globally distributed, multi-model NoSQL database service. It provides automatic scaling of storage and throughput, high availability across regions, flexible data models, and developer productivity with support for SQL and JavaScript queries. Customers can use DocumentDB for building scalable applications that need to handle large volumes of data across any number of regions worldwide with low latency and high availability.
Azure Cosmos DB - NoSQL Strikes Back (An introduction to the dark side of you... (Andre Essing)
This document summarizes an introduction presentation about Azure Cosmos DB. It discusses key aspects of Cosmos DB including that it is a globally distributed, massively scalable database that supports multiple data models. It also covers request units, partitioning, indexing, consistency models, and other architectural aspects that allow Cosmos DB to elastically scale storage and throughput worldwide.
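The summary above mentions partitioning and request units without showing how they interact. The sketch below, a pure-Python illustration and not Cosmos DB's actual algorithm, shows the general idea of hash partitioning: every document with the same partition key lands on the same partition, which is why single-partition reads are cheap in request units and cross-partition queries cost more.

```python
import hashlib

def route_to_partition(partition_key: str, physical_partitions: int) -> int:
    """Illustrative only: map a partition key to a partition by hashing.

    Cosmos DB actually hashes partition-key values onto an internal
    keyspace and splits ranges across physical partitions automatically;
    this sketch just shows the general idea of hash partitioning.
    """
    digest = hashlib.md5(partition_key.encode("utf-8")).hexdigest()
    return int(digest, 16) % physical_partitions

# Documents sharing a partition key always route to the same partition.
assert route_to_partition("user-42", 4) == route_to_partition("user-42", 4)
```

A side effect visible in this sketch is that changing the partition count remaps keys, which is one reason real systems split partition ranges rather than re-hash everything.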
Can Your Mobile Infrastructure Survive 1 Million Concurrent Users? (TechWell)
When you’re building the next killer mobile app, how can you ensure that your app is both stable and capable of near-instant data updates? The answer: Build a backend! Siva Katir says that there’s much more to building a backend than standing up a SQL server in your datacenter and calling it a day. Since different types of apps demand different backend services, how do you know what sort of backend you need? And, more importantly, how can you ensure that your backend scales so you can survive an explosion of users when you are featured in the app store? Siva discusses the common scenarios facing mobile app developers looking to expand beyond just the device. He’ll share best practices learned while building the PlayFab and other companies’ backends. Join Siva to learn how you can ensure that your app can scale safely and affordably into the millions of concurrent users and across multiple platforms.
Revolutionizing the Customer Experience - Hello Engagement Database (Dipti Borkar)
This document discusses Couchbase's engagement database platform and its key features. It summarizes Couchbase as having a memory-first architecture that provides fast access to data and indexes in memory. It also supports features like flexible querying with N1QL, global indexing, full text search, mobile sync capabilities, and real-time analytics. The document outlines Couchbase's core design principles and how it provides a unified platform for key-value, query, search and analytics workloads across public and private clouds.
Azure SQL DB Managed Instances: Built to Easily Modernize the Application Data Layer (Microsoft Tech Community)
The document discusses Azure SQL Database Managed Instance, a new fully managed database service that provides SQL Server compatibility. It offers seamless migration of SQL Server workloads to the cloud with full compatibility, isolation, security and manageability. Customers can realize up to a 406% ROI over on-premises solutions through lower TCO, automatic management and scaling capabilities.
Using Mainframe Data in the Cloud: Design Once, Deploy Anywhere in a Hybrid W... (Precisely)
Your company is storing and processing more data in the cloud – and mainframe data is no exception. Whether you’re centralizing enterprise data for analytics, streaming it to real-time cloud-native applications, or archiving for regulatory compliance, you know your mainframe data has to be included. Unfortunately, as with most mainframe initiatives, this is easier said than done!
View this webcast on-demand to learn some practical strategies for leveraging mainframe data in the cloud. We will cover:
• Common use cases for mainframe data in the cloud
• Challenges in using mainframe data in the cloud – and how to solve them
• How to get started
"An introduction to Kx Technology - a Big Data solution", Kyra Coyne, Data Sc... (Dataconomy Media)
Kx Technology is an in-memory columnar database and programming system that is highly optimized for real-time streaming and historical time-series data analytics. It provides extreme performance at low latency and can scale to process massive data volumes without significant infrastructure. Kx has been widely adopted over two decades in the financial services industry for applications like market surveillance, risk management, and quantitative research.
Slides: Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Donna Burbank, Managing Director of Global Data Strategy, Ltd., will host a webinar series on data architecture strategies. The June 25th webinar will focus on the differences and alignment between enterprise architecture and data architecture. Enterprise architecture provides a visual blueprint of an organization's key assets and how they interrelate, including data, processes, applications and more. The webinar will discuss how data architecture is a critical component of enterprise architecture and how it can enhance business value.
Leapfrog into Serverless - a Deloitte-Amtrak Case Study | Serverless Confere... (Gary Arora)
This talk was delivered at the Serverless Conference in New York City in 2017. Deloitte and Amtrak built a serverless cloud-native solution on AWS for a real-time operational data store and a near-real-time reporting data mart that modernized Amtrak's legacy systems and applications. With serverless solutions, they were able to leapfrog several rungs of computing evolution.
Gary Arora is a Cloud Solutions Architect at Deloitte Consulting, specializing in Azure & AWS.
Azure provides cloud computing services including computing, analytics, networking, storage, and more. It offers virtual machines, databases, websites, and other services that can be accessed from anywhere and scaled up as needed. Azure aims to provide enterprise-grade services that are economical, scalable, and hybrid-ready to work with existing on-premises systems. It has data centers across the world and over 600,000 servers to provide its services globally at scale.
This document discusses a community conference focused on cloud computing. It promotes connecting, sharing, and learning at the event. Several speakers are highlighted including Rohan Kumar from Microsoft who will give a keynote on data platforms. The document discusses major trends converging around intelligence, cloud, big data and IoT. It promotes Microsoft solutions for optimizing IT and business transformation through an intelligent platform, self-managed services, a modern data platform, and integrated intelligence.
Digital Transformation with Microsoft Data and AI (Michael Roenker)
This document discusses how data, AI, and digital transformation can drive business value. It promotes Microsoft's data and AI platform for enabling organizations to optimize their data platforms, transform with analytics and AI, and differentiate with new applications. The platform provides security, flexibility to use any data from anywhere, and capabilities for the cloud, hybrid scenarios, and on-premises. Case studies show how various companies achieved benefits like improved performance, reduced costs, accelerated innovation, and new revenue through use of Microsoft's data and AI tools.
The document discusses big data and machine learning solutions on AWS. It covers why organizations use big data, challenges they face, and how AWS solutions like S3 data lakes, Glue, Athena, Redshift, Kinesis, Elasticsearch, SageMaker, and QuickSight can help overcome these challenges. It also discusses how big data drives machine learning and how AWS machine learning services work. Core tenets discussed include building decoupled systems, using the right tool for the job, and leveraging serverless services.
Businesses are generating more data than ever before.
Doing real time data analytics requires IT infrastructure that often needs to be scaled up quickly and running an on-premise environment in this setting has its limitations.
Organisations often require a massive amount of IT resources to analyse their data and the upfront capital cost can deter them from embarking on these projects.
What’s needed is scalable, agile and secure cloud-based infrastructure at the lowest possible cost so they can spin up servers that support their data analysis projects exactly when they are required. This infrastructure must enable them to create proof-of-concepts quickly and cheaply – to fail fast and move on.
This technical pitch deck summarizes SAP solutions on Microsoft Azure. It outlines challenges with on-premises SAP environments and how moving to SAP HANA in the cloud on Azure can enable faster processes, accelerated innovation, and 360-degree insights. It then covers the journey to migrating SAP landscapes to SAP HANA and Azure, including lifting SAP systems with any database to Azure, migrating to SAP HANA, and migrating to S/4HANA. Finally, it discusses how Azure enables insights from SAP and non-SAP data.
This document provides an overview of migrating NoSQL workloads and data to Azure Cosmos DB. It discusses the challenges of managing NoSQL databases on-premises or in IaaS and how Azure Cosmos DB addresses these with a fully managed database service. It also describes how Azure Cosmos DB supports global distribution, elastic scaling, low latency access and comprehensive SLAs. The document outlines options for migrating MongoDB and Cassandra workloads using Azure Cosmos DB APIs and shows the process is simple, requiring only connection strings and existing tools.
The document describes how ContosoAir is building a more innovative flight booking app using Microsoft technologies. It discusses 5 areas of improvement:
1. Using Azure serverless architecture and Cosmos DB to improve global performance.
2. Enabling real-time notifications and personalization using Functions and Cosmos DB.
3. Applying machine learning to Cosmos DB data to intelligently predict and notify customers of flight delays.
4. Triggering real-time notifications through Functions in event-driven scenarios like gate changes.
5. Automating customer service with bots and gaining insights from feedback via cognitive services APIs.
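Item 2 above pairs Functions with Cosmos DB for real-time notifications. A minimal, framework-agnostic sketch of that pattern is below: in production this logic would live in an Azure Function bound to the Cosmos DB change feed, and the document shape (flightId, gate, passengers) is hypothetical.

```python
def notifications_for_changes(changed_docs):
    """Turn a batch of changed gate-assignment documents into messages.

    In an Azure Function this batch would arrive via the Cosmos DB
    change-feed trigger; here it is just a list of dicts so the routing
    logic can be shown (and tested) on its own.
    """
    messages = []
    for doc in changed_docs:
        for passenger in doc.get("passengers", []):
            messages.append(
                f"Flight {doc['flightId']}: gate changed to {doc['gate']} "
                f"(notify {passenger})"
            )
    return messages

batch = [{"flightId": "CA123", "gate": "B7", "passengers": ["alice@example.com"]}]
print(notifications_for_changes(batch))
```

Keeping the transformation pure like this makes the event-driven handler easy to unit-test independently of the trigger binding.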
Azure Cosmos DB offers different pricing options depending on data replication and provisioned throughput needs. Customers can choose between single-master or multi-master replication to write data to one or multiple Azure regions. Reserved capacity offers up to 65% savings with 1- or 3-year commitments but is billed hourly based on provisioned request units. Throughput can be provisioned at the database or container level and shared or isolated respectively.
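The throughput-based billing described above can be made concrete with a back-of-the-envelope estimator. The sketch below is illustrative only: the per-100-RU/s hourly rate is a hypothetical figure, since actual Cosmos DB prices vary by region, API, and single- vs multi-master configuration, and the 65% figure is the best case for reserved capacity.

```python
def monthly_cost_usd(ru_per_sec: int, rate_per_100ru_hr: float = 0.008,
                     hours_per_month: int = 730) -> float:
    """Estimate the monthly cost of provisioned throughput.

    rate_per_100ru_hr is an assumed list price (USD per 100 RU/s per hour)
    used only for illustration.
    """
    return (ru_per_sec / 100) * rate_per_100ru_hr * hours_per_month

standard = monthly_cost_usd(10_000)
reserved = standard * (1 - 0.65)  # "up to 65%" savings with reserved capacity
print(f"pay-as-you-go: ${standard:.2f}/month, best-case reserved: ${reserved:.2f}/month")
```

Because throughput can be provisioned at the database level (shared across containers) or per container (isolated), the same arithmetic applies at whichever scope the RU/s are set.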
This document provides an overview of Linux support on Microsoft Azure. It highlights that Azure supports major Linux distributions like Red Hat Enterprise Linux and SUSE Linux. It also notes that over 40% of VM cores and 60% of marketplace images on Azure are Linux-based. The document discusses Azure services that support Linux like compute options, security, databases, tools for migration and management. It includes customer quotes about using Linux and open source tools on Azure and case studies of companies migrating Linux workloads to Azure.
This document discusses high performance computing (HPC) on Microsoft Azure. It begins with an overview of the HPC opportunity in the cloud, highlighting how the cloud provides elasticity and scale to accommodate variable computing demands. It then outlines Azure's value proposition for HPC, including its productive, trusted and hybrid capabilities. The document reviews the various HPC resources available on Azure like VMs, GPUs, and Cray supercomputers. It also discusses solutions for HPC like Azure Batch, Azure Machine Learning Compute, Azure CycleCloud and Avere vFXT. Example industry use cases are provided for automotive, financial services, manufacturing, media/entertainment and oil/gas. The summary reiterates that Azure is uniquely positioned for HPC workloads.
The document discusses how global business value derived from artificial intelligence (AI) will reach $3.9 trillion in 2022. It notes that AI will generate business value through decision support, virtual agents, decision automation, smart products, and other areas.
Azure Machine Learning Services provides an end-to-end, scalable platform for operationalizing machine learning models. It allows users to deploy models everywhere from containers and Kubernetes to SQL Datawarehouse and Cosmos DB. It also offers tools to boost data science productivity, increase experimentation, and automate model retraining. The platform seamlessly integrates with Azure services and is built to deploy models globally at scale with high availability and low latency.
This document describes Microsoft's cognitive search capabilities for enriching and annotating content through natural language processing and computer vision. It discusses how unstructured data like text, images and videos can be ingested from various sources and stores, enriched with built-in and custom cognitive skills, and indexed for exploration and search. The enriched and annotated documents can then be used to train and deploy custom machine learning models.
This document provides an overview of Microsoft's Azure IoT platform and services. It describes Azure services for ingesting and analyzing IoT device data like IoT Hub, Stream Analytics, Machine Learning, and Time Series Insights. It also outlines edge computing capabilities with IoT Edge and device management solutions. Finally, it showcases several IoT solutions and provides links to learn more about building IoT applications on Azure.
The document discusses how organizations can leverage cloud, data, and AI to gain competitive advantages. It notes that 80% of organizations now adopt cloud-first strategies, AI investment increased 300% in 2017, and data is expected to grow dramatically. The document promotes Microsoft's cloud-based analytics services for harnessing data at scale from various sources and types. It provides examples of how companies have used these services to improve customer experience, reduce costs, speed up insights, and gain operational efficiencies.
Azure Database Services for MySQL, PostgreSQL and MariaDB (Nicholas Vossburg)
This document summarizes the Azure Database platform for relational databases. It discusses the different service tiers for databases including Basic, General Purpose, and Memory Optimized. It covers security features, high availability, scaling capabilities, backups and monitoring. Methods for migrating databases to Azure like native commands, migration wizards, and replication are also summarized. Best practices for achieving performance are outlined related to network latency, storage, and CPU.
Microsoft Cloud Adoption Framework for Azure: Thru Partner Governance Workshop (Nicholas Vossburg)
The document discusses establishing governance for cloud adoption using the Microsoft Cloud Adoption Framework. It recommends framing governance as a way to mitigate business risks. An assessment of the current and desired future states helps establish a vision. A minimally viable product (MVP) provides an initial governance foundation focusing on resource organization, consistency and basic controls using tools like Azure Blueprints and Policies. The governance approach then evolves further with each release to better align with cloud adoption.
Microsoft Cloud Adoption Framework for Azure: Governance Conversation (Nicholas Vossburg)
This document outlines Microsoft's Cloud Adoption Framework (CAF) governance model for governing cloud adoption. It recommends starting with an assessment of the current state and future vision. Then establish a Minimum Viable Product (MVP) for governance using core Azure services like management groups, subscriptions, resource groups, Azure Policy and role-based access control. The MVP should focus on key areas like resource tagging, grouping and security baselines. Governance then evolves by maturing the MVP with each cloud release to better align with cloud adoption and IT functions.
The Azure Migration Program (AMP) provides customers with guidance and resources to accelerate their cloud migration journey to Azure. It addresses customer needs for a singular migration approach, technical guidance, best practices, support for change management, and a one-stop shop. The program includes proven guidance from Microsoft, offers and incentives to defray costs, infrastructure and data foundations, migration planning and execution assistance, and specialized migration partners for expert guidance. Customers submit a simple form and within a few days will receive a response on how Microsoft can help with their specific migration project through self-serve resources, direct technical assistance, or an AMP offer.
Microsoft is providing information about the end of support dates for various products. Windows Server 2008 and 2008 R2 will reach end of support on January 14, 2020. Resources are provided about upgrading to newer versions like Windows Server 2019 by its end of support date in October 2020. Links are also included about Azure services that can help with migration, security, and compliance like GDPR when transitioning off end of support servers.
How Netflix Builds High Performance Applications at Global Scale (ScyllaDB)
We all want to build applications that are blazingly fast. We also want to scale them to users all over the world. Can the two happen together? Can users in the slowest of environments also get a fast experience? Learn how we do this at Netflix: how we understand every user's needs and preferences and build high performance applications that work for every user, every time.
Fluttercon 2024: Showing that you care about security - OpenSSF Scorecards fo... (Chris Swan)
Have you noticed the OpenSSF Scorecard badges on the official Dart and Flutter repos? It's Google's way of showing that they care about security. Practices such as pinning dependencies, branch protection, required reviews, continuous integration tests etc. are measured to provide a score and accompanying badge.
You can do the same for your projects, and this presentation will show you how, with an emphasis on the unique challenges that come up when working with Dart and Flutter.
The session will provide a walkthrough of the steps involved in securing a first repository, and then what it takes to repeat that process across an organization with multiple repos. It will also look at the ongoing maintenance involved once scorecards have been implemented, and how aspects of that maintenance can be better automated to minimize toil.
What's Next Web Development Trends to Watch.pdfSeasiaInfotech2
Explore the latest advancements and upcoming innovations in web development with our guide to the trends shaping the future of digital experiences. Read our article today for more information.
Are you interested in learning about creating an attractive website? Here it is! Take part in the challenge that will broaden your knowledge about creating cool websites! Don't miss this opportunity, only in "Redesign Challenge"!
Coordinate Systems in FME 101 - Webinar SlidesSafe Software
If you’ve ever had to analyze a map or GPS data, chances are you’ve encountered and even worked with coordinate systems. As historical data continually updates through GPS, understanding coordinate systems is increasingly crucial. However, not everyone knows why they exist or how to effectively use them for data-driven insights.
During this webinar, you’ll learn exactly what coordinate systems are and how you can use FME to maintain and transform your data’s coordinate systems in an easy-to-digest way, accurately representing the geographical space that it exists within. During this webinar, you will have the chance to:
- Enhance Your Understanding: Gain a clear overview of what coordinate systems are and their value
- Learn Practical Applications: Why we need datams and projections, plus units between coordinate systems
- Maximize with FME: Understand how FME handles coordinate systems, including a brief summary of the 3 main reprojectors
- Custom Coordinate Systems: Learn how to work with FME and coordinate systems beyond what is natively supported
- Look Ahead: Gain insights into where FME is headed with coordinate systems in the future
Don’t miss the opportunity to improve the value you receive from your coordinate system data, ultimately allowing you to streamline your data analysis and maximize your time. See you there!
In this follow-up session on knowledge and prompt engineering, we will explore structured prompting, chain of thought prompting, iterative prompting, prompt optimization, emotional language prompts, and the inclusion of user signals and industry-specific data to enhance LLM performance.
Join EIS Founder & CEO Seth Earley and special guest Nick Usborne, Copywriter, Trainer, and Speaker, as they delve into these methodologies to improve AI-driven knowledge processes for employees and customers alike.
Video traffic on the Internet is constantly growing; networked multimedia applications consume a predominant share of the available Internet bandwidth. A major technical breakthrough and enabler in multimedia systems research and of industrial networked multimedia services certainly was the HTTP Adaptive Streaming (HAS) technique. This resulted in the standardization of MPEG Dynamic Adaptive Streaming over HTTP (MPEG-DASH) which, together with HTTP Live Streaming (HLS), is widely used for multimedia delivery in today’s networks. Existing challenges in multimedia systems research deal with the trade-off between (i) the ever-increasing content complexity, (ii) various requirements with respect to time (most importantly, latency), and (iii) quality of experience (QoE). Optimizing towards one aspect usually negatively impacts at least one of the other two aspects if not both. This situation sets the stage for our research work in the ATHENA Christian Doppler (CD) Laboratory (Adaptive Streaming over HTTP and Emerging Networked Multimedia Services; https://athena.itec.aau.at/), jointly funded by public sources and industry. In this talk, we will present selected novel approaches and research results of the first year of the ATHENA CD Lab’s operation. We will highlight HAS-related research on (i) multimedia content provisioning (machine learning for video encoding); (ii) multimedia content delivery (support of edge processing and virtualized network functions for video networking); (iii) multimedia content consumption and end-to-end aspects (player-triggered segment retransmissions to improve video playout quality); and (iv) novel QoE investigations (adaptive point cloud streaming). We will also put the work into the context of international multimedia systems research.
Transcript: Details of description part II: Describing images in practice - T...BookNet Canada
This presentation explores the practical application of image description techniques. Familiar guidelines will be demonstrated in practice, and descriptions will be developed “live”! If you have learned a lot about the theory of image description techniques but want to feel more confident putting them into practice, this is the presentation for you. There will be useful, actionable information for everyone, whether you are working with authors, colleagues, alone, or leveraging AI as a collaborator.
Link to presentation recording and slides: https://bnctechforum.ca/sessions/details-of-description-part-ii-describing-images-in-practice/
Presented by BookNet Canada on June 25, 2024, with support from the Department of Canadian Heritage.
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/07/intels-approach-to-operationalizing-ai-in-the-manufacturing-sector-a-presentation-from-intel/
Tara Thimmanaik, AI Systems and Solutions Architect at Intel, presents the “Intel’s Approach to Operationalizing AI in the Manufacturing Sector,” tutorial at the May 2024 Embedded Vision Summit.
AI at the edge is powering a revolution in industrial IoT, from real-time processing and analytics that drive greater efficiency and learning to predictive maintenance. Intel is focused on developing tools and assets to help domain experts operationalize AI-based solutions in their fields of expertise.
In this talk, Thimmanaik explains how Intel’s software platforms simplify labor-intensive data upload, labeling, training, model optimization and retraining tasks. She shows how domain experts can quickly build vision models for a wide range of processes—detecting defective parts on a production line, reducing downtime on the factory floor, automating inventory management and other digitization and automation projects. And she introduces Intel-provided edge computing assets that empower faster localized insights and decisions, improving labor productivity through easy-to-use AI tools that democratize AI.
Blockchain and Cyber Defense Strategies in new genre timesanupriti
Explore robust defense strategies at the intersection of blockchain technology and cybersecurity. This presentation delves into proactive measures and innovative approaches to safeguarding blockchain networks against evolving cyber threats. Discover how secure blockchain implementations can enhance resilience, protect data integrity, and ensure trust in digital transactions. Gain insights into cutting-edge security protocols and best practices essential for mitigating risks in the blockchain ecosystem.
UiPath Community Day Kraków: Devs4Devs ConferenceUiPathCommunity
We are honored to launch and host this event for our UiPath Polish Community, with the help of our partners - Proservartner!
We certainly hope we have managed to spike your interest in the subjects to be presented and the incredible networking opportunities at hand, too!
Check out our proposed agenda below 👇👇
08:30 ☕ Welcome coffee (30')
09:00 Opening note/ Intro to UiPath Community (10')
Cristina Vidu, Global Manager, Marketing Community @UiPath
Dawid Kot, Digital Transformation Lead @Proservartner
09:10 Cloud migration - Proservartner & DOVISTA case study (30')
Marcin Drozdowski, Automation CoE Manager @DOVISTA
Pawel Kamiński, RPA developer @DOVISTA
Mikolaj Zielinski, UiPath MVP, Senior Solutions Engineer @Proservartner
09:40 From bottlenecks to breakthroughs: Citizen Development in action (25')
Pawel Poplawski, Director, Improvement and Automation @McCormick & Company
Michał Cieślak, Senior Manager, Automation Programs @McCormick & Company
10:05 Next-level bots: API integration in UiPath Studio (30')
Mikolaj Zielinski, UiPath MVP, Senior Solutions Engineer @Proservartner
10:35 ☕ Coffee Break (15')
10:50 Document Understanding with my RPA Companion (45')
Ewa Gruszka, Enterprise Sales Specialist, AI & ML @UiPath
11:35 Power up your Robots: GenAI and GPT in REFramework (45')
Krzysztof Karaszewski, Global RPA Product Manager
12:20 🍕 Lunch Break (1hr)
13:20 From Concept to Quality: UiPath Test Suite for AI-powered Knowledge Bots (30')
Kamil Miśko, UiPath MVP, Senior RPA Developer @Zurich Insurance
13:50 Communications Mining - focus on AI capabilities (30')
Thomasz Wierzbicki, Business Analyst @Office Samurai
14:20 Polish MVP panel: Insights on MVP award achievements and career profiling
AC Atlassian Coimbatore Session Slides( 22/06/2024)apoorva2579
This is the combined Sessions of ACE Atlassian Coimbatore event happened on 22nd June 2024
The session order is as follows:
1.AI and future of help desk by Rajesh Shanmugam
2. Harnessing the power of GenAI for your business by Siddharth
3. Fallacies of GenAI by Raju Kandaswamy
Performance Budgets for the Real World by Tammy EvertsScyllaDB
Performance budgets have been around for more than ten years. Over those years, we’ve learned a lot about what works, what doesn’t, and what we need to improve. In this session, Tammy revisits old assumptions about performance budgets and offers some new best practices. Topics include:
• Understanding performance budgets vs. performance goals
• Aligning budgets with user experience
• Pros and cons of Core Web Vitals
• How to stay on top of your budgets to fight regressions
How RPA Help in the Transportation and Logistics Industry.pptxSynapseIndia
Revolutionize your transportation processes with our cutting-edge RPA software. Automate repetitive tasks, reduce costs, and enhance efficiency in the logistics sector with our advanced solutions.
Data Protection in a Connected World: Sovereignty and Cyber Securityanupriti
Delve into the critical intersection of data sovereignty and cyber security in this presentation. Explore unconventional cyber threat vectors and strategies to safeguard data integrity and sovereignty in an increasingly interconnected world. Gain insights into emerging threats and proactive defense measures essential for modern digital ecosystems.
2. MODERN APPS FACE NEW CHALLENGES
• Processing and analyzing large, complex data
• Offering low latency to global users
• Managing and syncing data distributed around the globe
• Delivering highly responsive, real-time personalization
• Scaling both throughput and storage based on global demand
• Modernizing existing apps and data
3. AZURE COSMOS DB
APIs: Core (SQL) API, MongoDB API, Table API
Data models: Document, Column-family, Key-value, Graph
• Turnkey global distribution
• Elastic scale-out of storage & throughput
• Guaranteed low latency at the 99th percentile
• Comprehensive SLAs
• Five well-defined consistency models
4. AZURE COSMOS DB USE CASES
NoSQL modernization and migration to Azure Cosmos DB
• Handle peak sales periods with ease: retail and e-commerce apps; modern apps that need to elastically scale to handle spikes in traffic
• Deliver relevant real-time personalization: any modern customer-facing application
• Leverage IoT telemetry to build differentiated experiences: manage device telemetry; device registry
• Deliver high-quality app experiences globally at any scale: multiplayer games
Modernize and build new apps with real-time personalization
99.999% HA for reads and writes, extremely low latency at any scale worldwide
Top sectors include Retail, IoT/Manufacturing, Gaming, and ISV; emerging sectors include Financial Services and Health Care.
5. AZURE COSMOS INDUSTRY SCENARIOS
• Retail
• Order Processing Pipeline
• Product Catalog
• Personalization
• Real-time analytics
• Financial Services
• Audit Trail
• Tax Forms
• Risk Analysis
• IoT + Manufacturing
• Device Telemetry
• Device Registry
• Supply Chain Management
• ISV
• Content Management (CMS)
• Data Interchange
• DevOps Dependency Management
• Knowledge Graphs
• Gaming
• Social Clans/Guilds
• Leaderboards
• Messaging
• Healthcare
• Data Interchange (HL7 FHIR)
7. EASY TO MIGRATE NOSQL APPS TO AZURE COSMOS DB
Make data modernization easy with seamless migration of NoSQL workloads to the cloud.
• The Azure Cosmos DB MongoDB API, Cassandra API, and SQL API bring app data from existing NoSQL deployments
• Leverage existing tools, drivers, and libraries, and continue using existing apps' current SDKs
• Turnkey geo-replication
• No infrastructure or VM management required
DynamoDB
MongoDB
Couchbase
CouchDB
Neo4j
HBase
Cassandra
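Because existing drivers and tools keep working, a MongoDB migration is largely a matter of repointing the application's connection string at the Cosmos DB account. As an illustration only (placeholder account name and key; the exact string is shown on the account's portal blade), a Cosmos DB API for MongoDB connection string follows this shape:

```text
mongodb://<account-name>:<account-key>@<account-name>.mongo.cosmos.azure.com:10255/?ssl=true&replicaSet=globaldb
```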
8. MIGRATE CASSANDRA/DATASTAX WORKLOADS TO AZURE COSMOS DB
Questions to ask customers with Cassandra workloads
• Does the database have high costs of infrastructure, licenses, and database management?
• How much time is spent managing the database vs. focusing on innovation? Managing and configuring a Cassandra database is hard and time-consuming, including:
  • Capacity management
  • Performance management
  • Availability management
• Are you trying to achieve global scale? Building high-performing, scalable apps across multiple regions is difficult and time-consuming
KEY SCENARIO CONVERSATIONS
Competitive TCO
• Up to 2-6x savings when moving from on-premises/IaaS Cassandra to Cosmos DB
• No DevOps or license fees
• No high costs for hardware and database maintenance
• Elastically scale up and down based on your requirements
Fully managed service
• Born-in-the-cloud database service that reduces the need to manage and configure the database
• Cassandra developers can leverage existing drivers, libraries, and tools
• Automatic indexing and partitioning
• Elastic scale-out
Guarantees high performance worldwide
• Enterprise-level SLAs that guarantee 99.999% HA for reads and writes and millisecond latency worldwide
• High performance at global scale: turnkey global distribution allows developers to replicate data anywhere in the world in minutes
• Multi-master support across all Azure regions
• Only database that offers a choice of consistency models
• Enterprise-grade security
Target audience:
• ITDMs
• Heads of development
• Architects
High-potential industries:
• Retail
• Manufacturing (IoT scenarios)
• Automotive
• Financial Services
• Gaming
Top reasons for customers to migrate to Azure Cosmos DB
• Competitive TCO: up to 2-6x savings when moving from on-premises/IaaS Cassandra to Cosmos DB
• Fully managed service: reduces the need to manage and configure the database
• Guaranteed high performance anywhere in the world, with industry-leading SLAs for high availability and low latency
3rd-party tools & services to support migration
• Imanis Data
• Striim
Top resources to support you with this scenario
• NoSQL Migration to Azure SafePassage Program
• NoSQL to DB Migration Guide
• NoSQL Migration FAQ
• Cosmos DB SI Partner List
• FY19 NoSQL Migration Offer
• Cosmos DB Infopedia Page
Successful customers
9. MIGRATING CASSANDRA WORKLOADS
What was the app they migrated?
• SPOC is a notification service for Symantec endpoints.
• Every Symantec product's (SEP, Norton security products) endpoint registers with SPOC and opens a constant long poll to the SPOC server.
• For every write done to SPOC, there is a subset of reads from clients based on channel.
• Whenever new changes/updates come to SPOC, they are propagated to all connected eligible devices.
Symantec is migrating multiple workloads from DSE Cassandra.
• Leveraging multiple APIs depending on workload requirements
• Chose Azure Cosmos DB because it offers a fully managed service, reduces the pain of managing and scaling the database, and provides SLAs around high availability and low latency.
10. MIGRATING FROM MONGODB
Bentley is an ISV with several cloud services for manufacturing organizations. As part of this, they need to ingest construction data from several products and consolidate it into a single persistent store.
They turned to Azure Cosmos DB for its fully managed and globally scalable service and its compatibility with MongoDB.
• Ingesting data from several products and consolidating it into a single persistent store was time-consuming and challenging.
• A highly performant and globally scalable database service
• The fully managed service allows Bentley to be more agile and reduces the need for data management
• Azure Cosmos DB offers a dynamic schema, which allowed Bentley to ingest data from multiple sources and create mappings between the various schemas
"Building a flexible, scalable data layer with Azure Cosmos DB will enable us to deliver actionable insights to our users," says Phil Christensen, Senior Vice President for Reality Modeling & Cloud Services at Bentley Systems.
View the case study here
12. HANDLE PEAK SALES PERIODS WITH EASE
Offer customers fast and reliable service quality during seasonal and other high-traffic sales periods.
• Instant, elastic scaling handles traffic and sales bursts
• Provisioned throughput ensures predictable performance for mission-critical microservices (e.g., shopping cart)
• Low-latency data access from anywhere in the world for fast, robust user experiences
• High availability across multiple data centers
Walmart Labs (aka jet.com) ensures a reliable app experience for customers on Black Friday, Cyber Monday, and other high-traffic periods
13. PRODUCT CATALOG REFERENCE ARCHITECTURE

Item            | Color    | Microwave safe | Liquid capacity | CPU                                 | Memory | Storage
Geek mug        | Graphite | Yes            | 16oz            | ???                                 | ???    | ???
Coffee Bean mug | Tan      | No             | 12oz            | ???                                 | ???    | ???
Surface Book    | Gray     | ???            | ???             | 3.4 GHz Intel Skylake Core i7-6600U | 16GB   | 1 TB SSD
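The catalog above mixes mugs (liquid capacity, microwave-safe) and a laptop (CPU, memory, storage) in one store, which is exactly what schema-agnostic storage permits. A minimal pure-Python sketch of the idea, with plain dicts standing in for Cosmos DB documents (no SDK involved; all names are illustrative):

```python
# Each catalog item is a JSON-like document; items carry only the
# attributes that apply to them (schema-agnostic storage).
catalog = [
    {"id": "1", "item": "Geek mug", "color": "Graphite",
     "microwaveSafe": True, "liquidCapacityOz": 16},
    {"id": "2", "item": "Coffee Bean mug", "color": "Tan",
     "microwaveSafe": False, "liquidCapacityOz": 12},
    {"id": "3", "item": "Surface Book", "color": "Gray",
     "cpu": "3.4 GHz Intel Skylake Core i7-6600U",
     "memoryGB": 16, "storage": "1 TB SSD"},
]

def query(container, **filters):
    """Return documents whose attributes match all given filters.
    Documents that lack a filtered attribute simply don't match,
    mirroring a query over a property that is absent."""
    return [d for d in container
            if all(d.get(k) == v for k, v in filters.items())]

print([d["item"] for d in query(catalog, microwaveSafe=True)])  # ['Geek mug']
print([d["item"] for d in query(catalog, memoryGB=16)])         # ['Surface Book']
```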
14. ORDER PROCESSING REFERENCE ARCHITECTURE
15. ORDER PROCESSING REFERENCE ARCHITECTURE
https://aka.ms/order-processing
16. DELIVER RELEVANT REAL-TIME RECOMMENDATIONS
Help customers discover items they'll love with real-time personalization and product recommendations.
• Machine learning models generate real-time recommendations across product catalogs
• High volumes of product data can be analyzed in milliseconds
• Low latency ensures high app performance worldwide
• Tunable data consistency models for rapid insight
Online Recommendations Service (HOT path)
Offline Recommendations Engine (COLD path)
ASOS delivers personalized shopping experiences and real-time order updates to 15 million customers, helping them grow and win with millennial shoppers.
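The HOT/COLD split can be sketched in a few lines: an offline (cold path) job precomputes per-user recommendations and writes them keyed by user id, so the online (hot path) service answers each request with a single point read. A toy pure-Python sketch (the "also bought" heuristic, the dict store, and all names are made up for illustration):

```python
def cold_path_recompute(purchase_history):
    """Batch job: derive naive 'others also bought' recs per user."""
    store = {}
    for user, items in purchase_history.items():
        # Toy model: recommend items the user hasn't bought yet,
        # drawn from everyone else's purchases.
        others = {i for u, its in purchase_history.items()
                  if u != user for i in its}
        store[user] = {"id": user, "recs": sorted(others - set(items))}
    return store

def hot_path_lookup(store, user_id):
    """Online service: low-latency point read by key."""
    doc = store.get(user_id, {"recs": []})
    return doc["recs"]

history = {"alice": ["mug"], "bob": ["mug", "laptop"], "carol": ["tea"]}
store = cold_path_recompute(history)
print(hot_path_lookup(store, "alice"))  # ['laptop', 'tea']
```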
17. RECOMMENDATION ENGINE REFERENCE ARCHITECTURE
https://aka.ms/recommendation-engine
18. REAL-TIME ANALYTICS REFERENCE ARCHITECTURE
https://aka.ms/retail-analytics
20. LEVERAGE IOT TELEMETRY TO BUILD DIFFERENTIATED EXPERIENCES
Diverse and unpredictable IoT sensor workloads require a responsive data platform
• Real-time vehicle diagnostics
• Instant elastic scaling
• No loss in ingestion or query performance
Azure Cosmos DB was chosen for its ability to ingest data at massive scale with a high availability (99.99%) guarantee.
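The "real-time vehicle diagnostics" bullet above can be illustrated with a small sketch: telemetry documents arrive partitioned by device id, and a consumer reading new events (the way a change-feed-style stream would deliver them) flags out-of-range readings per device. Pure Python, no SDK; the field names and threshold are hypothetical:

```python
from collections import defaultdict

TEMP_LIMIT_C = 110  # hypothetical engine-temperature threshold

def ingest(events):
    """Group incoming telemetry by its partition key (deviceId)."""
    partitions = defaultdict(list)
    for e in events:
        partitions[e["deviceId"]].append(e)
    return partitions

def diagnose(partitions):
    """Flag devices whose latest reading exceeds the threshold."""
    return sorted(dev for dev, evs in partitions.items()
                  if evs[-1]["engineTempC"] > TEMP_LIMIT_C)

events = [
    {"deviceId": "truck-1", "engineTempC": 95},
    {"deviceId": "truck-2", "engineTempC": 118},
    {"deviceId": "truck-1", "engineTempC": 101},
]
print(diagnose(ingest(events)))  # ['truck-2']
```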
21. STREAM PROCESSING REFERENCE ARCHITECTURE
https://aka.ms/stream-processing-databricks
22. STREAM PROCESSING REFERENCE ARCHITECTURE
https://aka.ms/stream-processing-sa
23. IOT, BIG DATA OPTIMIZE OPERATIONS AT EXXONMOBIL SUBSIDIARY
Find a better way to monitor remote wells and collect data on performance
• Must be cost-efficient
• Unified device management and streaming
• Automate IoT and analytics
"We had a team of five people working on this, and they built it from scratch. The ease of use of the Azure services and the support we got from Microsoft made that possible."
24. STREAM PROCESSING REFERENCE ARCHITECTURE
25. STREAM PROCESSING REFERENCE ARCHITECTURE
https://aka.ms/streaming-scale-cosmosdb
27. DELIVER HIGH-QUALITY EXPERIENCES AT ANY SCALE GLOBALLY
Need a database that seamlessly responds to massive scale and performance demands
• Multiplayer gameplay with low latency
• Instant capacity scaling from launch onward
• Uninterrupted global user experience
The Walking Dead: No Man's Land chose Azure Cosmos DB because of its extremely low latency and massive scale worldwide.
28. LEADERBOARDS REFERENCE ARCHITECTURE
https://aka.ms/azure-gaming
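A leaderboard reduces to two operations: upsert a player's score document and query the top N ordered by score descending. A minimal pure-Python sketch (a dict stands in for the leaderboard container; names are illustrative):

```python
def upsert_score(board, player, score):
    """Keep each player's best score (upsert semantics)."""
    board[player] = max(score, board.get(player, 0))

def top_n(board, n):
    """Return the n highest-scoring (player, score) pairs."""
    return sorted(board.items(), key=lambda kv: -kv[1])[:n]

board = {}
upsert_score(board, "rick", 4200)
upsert_score(board, "glenn", 3100)
upsert_score(board, "rick", 3900)      # lower score: best is kept
upsert_score(board, "michonne", 5000)
print(top_n(board, 2))  # [('michonne', 5000), ('rick', 4200)]
```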
29. GAME ANALYTICS REFERENCE ARCHITECTURE
https://aka.ms/azure-gaming
31. FIDELITY BUILDS MORTGAGE INSURANCE APP TO ENHANCE CUSTOMER EXPERIENCE
Fidelity chose Azure Cosmos DB for its ease of global distribution, ability to scale, and fully managed service, reducing database management overhead.
Fidelity built a new application, EXOS: the only mobile digital mortgage application designed specifically to extend and enhance every critical consumer touchpoint throughout the entire mortgage lending life cycle.
• EXOS offers a real-time personalized experience for customers across the entire mortgage process, including appointment scheduling and communications, enhancing customer experience and process
• Ensures consistent, personalized, and accurate information for customers throughout the process
• EXOS Closing offers unmatched consumer satisfaction and transparency into the closing process
32. A FINANCIAL TREND SAAS ENGINE FOR INVESTORS
Need a database that can handle any schema and adapt quickly to rapid changes
• Financial SaaS engine with no DevOps
• Super fast to handle financial data
• Scalable on demand, globally distributed
Business models are under attack, especially in the financial industry. Azure Cosmos DB is a technology that can adapt, evolve, and allow a business to innovate faster in order to turn opportunities into strategic advantages.
[Architecture diagram: Internet / mobile browser → App Service with Application Insights, backed by Azure SQL Database, Azure Storage, Azure Cosmos DB, Azure Functions, and external services]
33. REAL-TIME PAYMENTS PIPELINE
Steady state: 10M transactions/day; peak hours: 3-4K transactions/sec
• Financial SaaS engine with no DevOps
• Super fast to handle financial data
• Scalable on demand, globally distributed
Centralize payment pipelines and build real-time processing and analytics. The goal is to introduce a common pipeline that accepts transactions from all sources and distributes them to the right pipeline as well as to other consumers, such as analytics.
34. SECURITIES PROCESSING REFERENCE ARCHITECTURE
35. IMAGE CLASSIFICATION REFERENCE ARCHITECTURE
https://aka.ms/image-processing
37. MAPS OUT SUCCESSFUL STRATEGY WITH COSMOS DB
World's third-largest mapping agency
• Support for spatial queries and standards
• Identify every rooftop in Britain
• Scalability and flexibility to handle millions of properties
The solution can identify the roof types of all 35.7 million properties in Britain in less than 24 hours with 95% accuracy.
42. SUMMARY OF TOP PERFORMING INDUSTRY USE CASES

Retail / e-commerce
Top challenges:
• Ensure a high-performing app regardless of seasonal demands and peak traffic
• Deliver differentiated customer experiences with personalization
• Ability to be agile and ensure faster time to market
Use cases:
• Order and payment processing
• Real-time personalization
• Inventory management
• Product catalogs
Why Azure Cosmos DB won:
• Elastic scale to handle seasonal traffic
• Guarantees high availability and low-latency access anywhere in the world
• Schema-agnostic storage and automatic indexing to handle diverse product catalogs, orders, and events

Manufacturing / IoT
Top challenges:
• Leverage data from multiple devices to build differentiated experiences, enhance processes, or feed analytics
• Ingest huge volumes of data from multiple sources worldwide
• Ability to be agile and respond quickly to issues
Use cases:
• Device telemetry
• Device registry
• Dependency
Why Azure Cosmos DB won:
• High scalability to ingest large numbers of events coming from many devices
• Low-latency queries and change feeds for responding quickly to anomalies
• Schema-agnostic storage and automatic indexing to support dynamic data coming from many different generations of devices
• Guarantees high availability and low latency across multiple data centers

Gaming
Top challenges:
• Ensure a high-quality game experience for large volumes of users and handle bursts of traffic
• Create fast and responsive gameplay for users all over the world
• Agility that allows teams to iterate quickly to fit a demanding ship schedule
• Support leaderboards and social gameplay
Use cases:
• Social clans / guilds
• Leaderboards
• Messaging
Why Azure Cosmos DB won:
• Elastic scale to handle seasonal traffic
• Low-latency queries to support responsive gameplay for a global user base
• Schema-agnostic storage and indexing allows teams to iterate quickly to fit a demanding ship schedule
• Change feeds to support leaderboards and social gameplay

Financial Services
Use cases:
• Audit trail
• Tax forms
• Underwriting / risk analysis
43. RESOURCES
• Solutions Architectures
• ASOS case study (retail, real-time personalization)
• Next Games case study (gaming, elastic scaling)
• Johnson Controls story (IoT)