What is the role of cloud computing in big data? "Cloud" has been one of the most widely used words in technology in recent years. Everyone has heard of the cloud, but in this article we will discuss it in the context of big data.

Cloud computing is a way of providing shared computing resources to applications that require elastic resources. These resources include applications, compute, storage, networking, development, and deployment platforms. The foundation of cloud computing is that resources are pooled and delivered to end users as a service. Google and Amazon.com are examples of cloud computing combined with big data: both offer big data services on top of their clouds.

There are two main cloud deployment models: 1) Public Cloud and 2) Private Cloud.

Public Cloud
A public cloud is built by a commercial provider (Amazon, Rackspace, ...) as a highly scalable data center that hides the complexity of the infrastructure from customers and offers a variety of services.

Private Cloud
A private cloud is infrastructure built by a single organization, which manages the scalability of its internal data center itself. (Wikipedia provides a quick side-by-side comparison of public and private clouds.)

Hybrid Cloud
A hybrid cloud is infrastructure composed of two or more public and private clouds. A hybrid cloud brings together the best of both deployment models.

Cloud and big data: common characteristics
Many characteristics of cloud architecture and cloud computing are also important for big data. The cloud computing characteristics that matter most for big data are:

- Scalability
- Elasticity
- Resource pooling
- Ad-hoc infrastructure setup
- Low cost
- Pay-per-use (pay as you go)
- High availability

Leading big data cloud providers

Amazon
Amazon is said to be the most popular Infrastructure as a Service (IaaS) provider, and its history is an interesting one.
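Infrastructure as a Service is billed pay-per-use, one of the characteristics listed above. A toy sketch of what that means for cost, compared with the up-front investment of a private cloud (all rates and prices below are made-up placeholders, not real vendor pricing):

```python
# Toy break-even sketch: pay-as-you-go public cloud vs. fixed private-cloud cost.
# Every number here is an invented placeholder, not a real vendor rate.

def public_cloud_cost(hours, rate_per_hour):
    """Pay-as-you-go: you only pay for the hours you actually use."""
    return hours * rate_per_hour

def private_cloud_cost(initial_cost, monthly_recurring, months):
    """Up-front hardware plus recurring operating cost, regardless of usage."""
    return initial_cost + monthly_recurring * months

# Example: 500 compute-hours per month for a year at a hypothetical $0.50/hour.
public = public_cloud_cost(hours=500 * 12, rate_per_hour=0.50)
private = private_cloud_cost(initial_cost=20000, monthly_recurring=300, months=12)

print(f"public:  ${public:.2f}")   # prints: public:  $3000.00
print(f"private: ${private:.2f}")  # prints: private: $23600.00
```

With low or bursty utilization the pay-per-use model wins easily; the real decision also involves the security, location, and compliance considerations discussed later in this article.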
Amazon started with infrastructure built to support its own retail business. Over time it realized that these resources were underutilized most of the time. It decided to make the most of them, and so in 2006 it launched the Amazon Elastic Compute Cloud (Amazon EC2) service. The offering has grown enormously since then and has become a core business alongside retail. Amazon also offers big data services as part of Amazon Web Services. Here is a list of those services:

- Amazon Elastic MapReduce: processes very large volumes of data
- Amazon DynamoDB: a NoSQL database service
- Amazon Simple Storage Service (S3): an online data storage service
- Amazon High Performance Computing: provides high-performance computing clusters
- Amazon Redshift: a petabyte-scale, scalable data warehouse service

Google
Although Google is best known for its search engine, the company offers much more than that:

- Google Compute Engine: provides secure computing in energy-efficient data centers
- Google BigQuery: allows SQL-like queries to be run against very large data sets
- Google Prediction API: a cloud-based machine learning tool

Other providers
Besides Amazon and Google, there are many other big data providers. Microsoft has joined in with Microsoft Azure. In addition, Rackspace and NASA together started OpenStack, whose goal is to provide a scalable cloud that can run on any hardware.

Cloud-based solutions can offer excellent integration with big data at very attractive economics. However, there are some things to consider when deploying a big data solution in the cloud:

- Data integrity
- Initial cost
- Recurring cost
- Performance
- Data access security
- Location
- Compliance

Each company has a different approach to big data, and different rules and regulations apply to each. Based on these factors, each company can deploy its own customized big data solution on a cloud.
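Warehouse services like Google BigQuery and Amazon Redshift expose huge datasets through familiar SQL-style queries. As a small local stand-in for that query style (this uses Python's built-in sqlite3, not the actual BigQuery or Redshift APIs, and the table and rows are invented for illustration):

```python
import sqlite3

# Local stand-in for the SQL-style analytics that BigQuery/Redshift provide
# at petabyte scale; the table name and rows are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (page TEXT, views INTEGER)")
conn.executemany(
    "INSERT INTO page_views VALUES (?, ?)",
    [("home", 120), ("about", 30), ("home", 80), ("pricing", 55)],
)

# An aggregate query of the kind you would run against a cloud data warehouse.
rows = conn.execute(
    "SELECT page, SUM(views) AS total "
    "FROM page_views GROUP BY page ORDER BY total DESC"
).fetchall()

for page, total in rows:
    print(page, total)
# prints:
# home 200
# pricing 55
# about 30
```

The point of the cloud services is that the same kind of query runs unchanged whether the table holds four rows or billions, with the provider handling the scaling.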