Cloud Container News: Ubuntu OpenStack, Tata Communications, JFrog, Open BMC, & Kaggle

Written by: Eliran Ouzan, Mar. 22, 2018

Container-as-a-Service (CaaS), SSD Storage, & Open Baseboard Management Controllers

Canonical released statistics last week positioning Ubuntu OpenStack as #1 overall in market share for OpenStack-based cloud data center management. Ubuntu claimed 55% of total business investment in this sector in 2017, ahead of rival enterprise Linux platform distributions Red Hat OpenShift and SUSE Cloud 7 with MicroOS. In India, the telecommunications and IT services provider Tata Communications announced a new cloud Container-as-a-Service (CaaS) platform developed by its programming teams as a solution for:

  • Enterprise companies in telecom, banking/finance, manufacturing, or ecommerce to adopt quickly for database modernization;
  • The production of custom-coded web/mobile apps with high performance cloud hosting support at the highest levels of user traffic;
  • Managing public/private/hybrid cloud network hardware for web publishing, mobile application, & SaaS product requirements.

Tata Communications continues a trend launched by Infosys and Wipro of building proprietary private cloud software solutions that outsourcing teams can use to support the operations of many enterprise corporations simultaneously. Integrator companies can install Red Hat OpenShift, Ubuntu OpenStack, SUSE Cloud 7, CoreOS, or RancherOS with Docker virtualization to manage complex solutions for enterprise clients using open source software tools. Other major announcements this week included new VMware, IBM, & AWS collaboration on cloud platform solutions; Microsoft establishing new SSD/BMC standards for hardware manufacturers & data centers (Project Denali/Project Cerberus); and continued insider fall-out over Google's acquisition of the data science company Kaggle, including its expert programming staff, archives of research data, & its community of hundreds of thousands of AI/ML/DL platform users. Apache Hadoop adoption is still growing quickly as a solution for "big data" requirements, including IoT, ecommerce, self-driving cars, cloud network analytics, & search technology.

Ubuntu OpenStack: 55% of Market Share vs. Red Hat OpenShift & SUSE Cloud 7 Distros

OpenStack was developed by Rackspace & NASA around 2010 as an advanced data center operating system for large-scale cloud computing requirements. The platform is modular and extensible, so companies and other organizations can customize it to their specific data center or cloud hosting requirements, e.g. running web servers to publish information publicly or hosting database servers for internal operations. Third-party software development companies can build tools that extend OpenStack in new ways: network monitoring, load balancing, remote storage, APIs, web server administration, data analytics, firewalls, encryption, & security. Ubuntu is now "the most widely used Linux distribution for OpenStack." AT&T, Bloomberg, PayPal, eBay, Sky, & Walmart currently manage their data centers and cloud hosting for web/mobile apps with Canonical's Ubuntu/OpenStack at the highest levels of web traffic. The next five competitors in the sector combined (Red Hat, SUSE, HP, Oracle, & IBM) do not equal Ubuntu's installed base across the cloud server hardware running OpenStack in production data centers today for enterprise-level web hosting services, primarily on private cloud architecture for Fortune 500 companies. Currently, the main open source Linux distributions of OpenStack are Ubuntu OpenStack, Red Hat OpenShift, and SUSE Cloud 7.
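
As a quick illustration of how administrators and third-party tools interact with an OpenStack cloud programmatically, here is a minimal sketch using the openstacksdk Python library to list compute instances and networks. The cloud name "my-openstack" and its credentials (normally defined in clouds.yaml) are assumptions for the example, not details from the article.

    # Minimal sketch: querying an OpenStack cloud with the openstacksdk library.
    # Assumes a "my-openstack" entry exists in clouds.yaml with valid credentials.
    import openstack

    # Open a connection using the credentials defined in clouds.yaml or the environment.
    conn = openstack.connect(cloud="my-openstack")

    # List compute instances (Nova) and the networks (Neutron) available to them.
    for server in conn.compute.servers():
        print(f"{server.name}: status={server.status}")

    for network in conn.network.networks():
        print(f"network {network.name}: shared={network.is_shared}")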

The advantage of enterprise data center adoption for software and hardware development companies in IT is that the corporations with the largest business operations also have the highest budgets for cloud services. Other than large government and education institutions, Fortune 500 corporate IT budgets represent the most lucrative contracts for IT hardware companies, software development companies, & cloud service providers. Integrator firms have expanded in the OpenStack ecosystem, releasing software under subscription or licensed-use fees that adds more advanced options for business data center deployments, for example installs of OpenStack targeted at web hosting companies with a Linux OS and a container orchestration engine. Integrator companies deliver consulting, contracting, software development, or DevOps outsourcing services to Fortune 500 businesses as outsourcing solutions providers. Tata Communications recently announced a Container-as-a-Service (CaaS) platform based on IZO™ Private Cloud that will be used to support IT services in enterprise business organizations. These private cloud services compete with public cloud options, offering different levels of synergy & cost of investment for complex organizations that need distributed team management tools for CI/CD software production spanning thousands of products or millions of hosted domains.

Integrating CI/CD with Docker: "Continuous Integration (CI) and Continuous Delivery (CD) methodologies are key traits of a modern software development practice. Docker Enterprise Edition (Docker EE) can be a catalyst for this DevOps mindset, integrating with your preferred tools and existing practices to improve the quality and speed at which innovation is delivered." Learn More About Docker & Gitlab CI/CD.
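
As a hedged sketch of what such a CI step can look like in practice, the following uses the Docker SDK for Python to build an image from a repository checkout and push it to a registry, the kind of task a GitLab CI or Docker EE pipeline automates. The registry host, repository name, and CI_COMMIT_SHA variable are illustrative assumptions.

    # Illustrative CI step: build and push a Docker image with the Docker SDK for Python.
    # The registry host, repository name, and CI_COMMIT_SHA variable are assumptions.
    import os
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    repo = "registry.example.com/myapp"
    tag = os.environ.get("CI_COMMIT_SHA", "dev")

    # Build the image from the Dockerfile in the current checkout.
    image, build_logs = client.images.build(path=".", tag=f"{repo}:{tag}")
    for entry in build_logs:
        if "stream" in entry:
            print(entry["stream"], end="")

    # Push the freshly built image so later pipeline stages can deploy it.
    for line in client.images.push(repo, tag=tag, stream=True, decode=True):
        print(line)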

Tata Communications: Containers-as-a-Service (CaaS) in IT Outsourcing Companies

The Tata Communications Container-as-a-Service (CaaS) platform allows systems administrators to balance project management agility with DevOps command-line tools. The platform bundles the required software subscriptions & dependencies for microservice scripts to run under containerized virtualization, and it includes a Secure Image Registry for provisioning hardware in elastic clusters from web server disk image snapshots. The software suite provides Role-Based Access on vertical clusters running in parallel, so multi-team Agile developers with many supported products can practice CI/CD versioning and sandbox testing. Container security is managed through user access permissions on each environment individually for private cloud control. Learn More about Tata Communications Container-as-a-Service (CaaS) Platform.
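
Tata's announcement does not show how per-environment permissions are configured, but the access model described above maps naturally onto Kubernetes RBAC. The following is a hedged sketch using the official Kubernetes Python client to create a read-only role and bind it to a team within a single namespace; the "staging" namespace and "qa-team" group are illustrative assumptions, not Tata's actual setup.

    # Illustrative per-environment access control: a read-only Role bound to one team
    # in one namespace. The "staging" namespace and "qa-team" group are assumptions.
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() inside the cluster
    rbac = client.RbacAuthorizationV1Api()

    role = {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "Role",
        "metadata": {"name": "pod-reader", "namespace": "staging"},
        "rules": [{"apiGroups": [""], "resources": ["pods", "pods/log"],
                   "verbs": ["get", "list", "watch"]}],
    }
    rbac.create_namespaced_role(namespace="staging", body=role)

    binding = {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "RoleBinding",
        "metadata": {"name": "qa-team-pod-reader", "namespace": "staging"},
        "subjects": [{"kind": "Group", "name": "qa-team",
                      "apiGroup": "rbac.authorization.k8s.io"}],
        "roleRef": {"kind": "Role", "name": "pod-reader",
                    "apiGroup": "rbac.authorization.k8s.io"},
    }
    rbac.create_namespaced_role_binding(namespace="staging", body=binding)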

Foundation Cloud Build & BootStack: "Canonical will design, build and operate a production OpenStack cloud on your premises from $15 per server per day. We’ll train and transfer control to your team on request. This is the recommended way to get to production fast, enabling you to ramp your workloads efficiently and focus on cloud consumption not infrastructure... Use Canonical’s conjure-up to deploy a multi-node OpenStack. With LXD containers, it all fits on your laptop, in VMware or even a public cloud. Ideal for first-time users or developers who want to study the components of OpenStack in action. Use it with MAAS (‘Metal as a Service’) to deploy across many servers and scale it to many racks." Learn More About Ubuntu OpenStack.

With container hosting, businesses can choose a Kubernetes plan at a major cloud host like AWS, GCP, or Microsoft Azure rather than installing an OpenStack network on AWS EC2 independently or in a private cloud data center. The main advantage of the major Linux distros based on OpenStack (SUSE, Red Hat, & Ubuntu) is that the platform security is tested in advance by experts and the packages offer ease of deployment at scale, with included customer support, training manuals, administration panels, and network monitoring utilities. Cloud installs of OpenStack allow one server to be designated as a host that controls the cluster networks using virtualized container partitions. The main alternative to this is a private cloud with OpenStack & Kubernetes implemented for data center operations. These methods are quite popular in web hosting for running VPS platforms and shared CentOS plans together on the same data center hardware, or across multiple data center locations internationally. The main open source alternatives to OpenStack, Kubernetes, & Docker are Mesosphere DC/OS, CoreOS Tectonic, & CloudStack. The main advantage of these platforms is that they allow for better data center hardware resource allocation in production through container virtualization, as well as supporting web server deployments at scale with the custom code and programming language extensions that web/mobile apps depend on to deliver major SaaS features to the public. Apache Hadoop provides tools for a next generation of apps based on data streams, for example powering Amazon.com user analytics or self-driving car networks.
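
To make the orchestration side of this concrete, here is a minimal sketch, again with the Kubernetes Python client, of scheduling a replicated web server across a cluster; this is the kind of deployment that managed Kubernetes plans at AWS, GCP, or Azure run on your behalf. The nginx image, replica count, and namespace are assumptions for illustration.

    # Minimal sketch: scheduling a replicated web server on a Kubernetes cluster.
    # The nginx image, replica count, and namespace are illustrative assumptions.
    from kubernetes import client, config

    config.load_kube_config()
    apps = client.AppsV1Api()

    deployment = {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": "web-frontend"},
        "spec": {
            "replicas": 3,  # spread the web server across cluster nodes
            "selector": {"matchLabels": {"app": "web-frontend"}},
            "template": {
                "metadata": {"labels": {"app": "web-frontend"}},
                "spec": {"containers": [{"name": "nginx", "image": "nginx:1.13",
                                         "ports": [{"containerPort": 80}]}]},
            },
        },
    }
    apps.create_namespaced_deployment(namespace="default", body=deployment)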

Tata Communications Containers-as-a-Service (CaaS) Platform:

IZO™ Private Cloud: "Integrate, manage and control your distributed IT environments using a single orchestration platform. Our open and flexible platform gives you a choice of hypervisors, operating systems and storage. Our dedicated or virtual integrated security architectures are configurable using Role-Based Access Control (RBAC) and Lightweight Directory Access Protocol (LDAP) integration." Learn More About Tata Communications.
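
The IZO™ description mentions RBAC with LDAP integration; as a rough sketch of what that wiring can look like, the snippet below uses the ldap3 Python library to look up a user's directory groups, which a platform could then map onto roles. The server address, bind credentials, and directory layout are assumptions, not Tata's actual configuration.

    # Illustrative LDAP lookup with the ldap3 library: fetch a user's groups for role mapping.
    # The hostname, bind credentials, and directory layout are assumptions.
    from ldap3 import Server, Connection, SUBTREE

    server = Server("ldap.example.com", use_ssl=True)
    conn = Connection(server, user="cn=svc-caas,ou=services,dc=example,dc=com",
                      password="change-me", auto_bind=True)

    # Find the groups that user "jdoe" belongs to.
    conn.search(search_base="ou=groups,dc=example,dc=com",
                search_filter="(&(objectClass=groupOfNames)"
                              "(member=uid=jdoe,ou=people,dc=example,dc=com))",
                search_scope=SUBTREE, attributes=["cn"])

    groups = [entry.cn.value for entry in conn.entries]
    print(groups)  # e.g. ['dev-team'], which the platform maps to RBAC roles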

To learn more about OpenStack, consider the ebook from Mirantis: "OpenStack: The Path to Cloud" (2017). Otherwise, VMware made some major announcements this week about working in partnership with IBM on their cloud services platform, as well as with remotely hosted vSphere installations on AWS for large-scale business solutions. Also trending in "Big Data" platform news is Apache HiveMall, which lets programmers build machine learning apps using SQL queries. JFrog has published a demo on how programmers are using C and C++ with Jenkins, Conan, and JFrog Artifactory to provision web servers for complex web/mobile app support in hosting environments. Brigade has released a new event scripting app for Kubernetes.
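
The JFrog demo referenced above revolves around Conan, whose package recipes are themselves written in Python. Below is a minimal, hedged sketch of a Conan 1.x conanfile.py for a C/C++ library; the package name, version, and zlib dependency are chosen purely for illustration.

    # Minimal Conan 1.x recipe sketch (conanfile.py) for a C/C++ library.
    # The package name, version, and zlib dependency are illustrative assumptions.
    from conans import ConanFile, CMake

    class DemoLibConan(ConanFile):
        name = "demolib"
        version = "0.1.0"
        settings = "os", "compiler", "build_type", "arch"
        requires = "zlib/1.2.11@conan/stable"  # resolved from a remote such as Artifactory
        generators = "cmake"
        exports_sources = "src/*", "CMakeLists.txt"

        def build(self):
            cmake = CMake(self)
            cmake.configure()
            cmake.build()

        def package(self):
            self.copy("*.h", dst="include", src="src")
            self.copy("*.a", dst="lib", keep_path=False)

        def package_info(self):
            self.cpp_info.libs = ["demolib"]

A CI job would then typically run conan create . to build the package and conan upload to publish it to an Artifactory remote, which is roughly the workflow the JFrog demo walks through with Jenkins.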

Apache Hadoop:"The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing. The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures. Learn More About Apache Hadoop.

Microsoft announced API support for developers to use its Custom Vision, Face API, and Bing Entity Search services in building web/mobile applications. According to ZDnet:

"These APIs and services are all part of the 25-plus Cognitive Services on which Microsoft is working. Last May, Microsoft officials said 568,000 developers were using its Cognitive Services to add AI smarts to their own apps and services. This week, officials said that more than one million developers have signed up and used Microsoft's Cognitive Services."

CMSwire writes that 80% of businesses are investing in AI and that this area of IT will see exponential growth over the next 10-20 years. First, there is a huge need for AI/ML/DL processing in autonomous vehicle and self-driving car networks, which will be managed at continental scale by render-AI server banks, cloud web hosts, & IT majors. Second, the same hardware (GPUs & machine learning TPUs) can be used for Bitcoin, blockchain, and virtual currency mining, as well as gaming & VR/AR apps. Google, AWS, and IBM already offer developers access to TensorFlow servers on the web hosting model, which can be integrated into apps and business services through APIs and custom software development. Mobile and tablet device uptake also changes what programmers are able to do with these tools in creative or disruptive ways. Quantum supremacy, 3-D metal printing, & Higgs boson research are some examples of the most advanced applications using AI/ML/DL servers in super-clusters today.
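
As a small example of the kind of workload those hosted TensorFlow servers run, here is a minimal Keras classifier in Python; the layer sizes and the random placeholder data are arbitrary, and in a hosted setting the same model would be trained or served behind an API rather than on a laptop.

    # Minimal TensorFlow/Keras sketch: a small classifier with arbitrary layer sizes
    # trained on random placeholder data, just to show the API shape.
    import numpy as np
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    x = np.random.rand(256, 20).astype("float32")
    y = np.random.randint(0, 10, size=(256,))
    model.fit(x, y, epochs=3, batch_size=32)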

"Big Data" & AI/Ml/DL: 'Circling back to the Kaggle acquisition; the third leg of the Google strategy stool rests on grooming a community of data scientists and machine learning experts (future and present) who are used to and comfortable inside Google's machine learning ecosystem. Whether it is the extraordinary support Google is giving the Tensorflow project (most recently incorporating deep learning library Keras) or the free education in the form of joint courses on machine learning and deep learning with Udacity or now the acquisition of half a million machine learning enthusiasts through Kaggle, these moves ensure that the practitioners toolbox will be based on Google standards and technology. The first order of business after the acquisition it seems is to move Kaggle's Kernels "a combination of environment, input, code, and output" over to the Google Cloud.' Learn More About Google's Kaggle Acquisition.

Anyone looking to learn more about data science and machine learning can choose from a large number of excellent resources available online, which can help experienced programmers get started with TensorFlow quickly using a language they already know. Kaggle data science research is well respected academically & professionally worldwide, with integrated hiring and recruitment tools for specialists.

Microsoft & the Open Compute Project: New Standards for SSD Storage Firmware

Project Denali: "Project Denali is a standardization and evolution of Open Channel that defines the roles of SSD vs. that of the host in a standard interface. Media management, error correction, mapping of bad blocks and other functionality specific to the flash generation stays on the device while the host receives random writes, transmits streams of sequential writes, maintains the address map, and performs garbage collection. Denali allows for support of FPGAs (field programmable gate arrays) or microcontrollers on the host side." Learn More About the Open Compute Project.

Open Compute Project (OCP): Project Denali, Project Cerberus, & Project Olympus 

The OpenBMC Project & Linux Foundation continued to work on establishing industry standards for open source baseboard management controller (BMC) firmware. Microsoft announced Project Denali for standardizing SSD firmware interfaces as well as the donation of Project Cerberus, a cryptographic microcontroller, to the Open Compute Project (OCP) for further collaborative development. According to ZDnet:

"Microsoft officials last year described Cerberus as the next phase of Project Olympus, its datacenter server design which the company contributed to the OCP... Microsoft joined the Open Compute Project (OCP) in 2014, and is a founding member of and contributor to the organization's Switch Abstraction Interface (SAI) project. The OCP publishes open hardware designs intended to be used to build datacenters relatively cheaply. The OCP has already released specifications for motherboards, chipsets, cabling, and common sockets, connectors, and open networking and switches."

Data centers can continue to monitor developments from the Open Compute Project (OCP) to track standards changes in web server and networking equipment. Software developers will need to monitor these changes for support of the industry standards in applications. SSD storage & data encryption remain two of the most rapidly innovating sectors of cloud web hosting.

Author:
Eliran Ouzan is the co-founder and designer of HostAdvice and also owns Moonshot Marketing LTD, a leading web design & development firm; he was also a member of Greenpeace.

He is widely known for his pixel-perfect, high-conversion web designs. Over the course of his web experience he has experimented with over 200 web hosting companies and has deep knowledge of what defines a good hosting company.
