OpenStack is getting bigger than ever. It now powers more than 75 public cloud data centers and thousands of private clouds at a scale of more than 10 million compute cores. But it’s always been hard to upgrade from one version of OpenStack to another, and it’s been hard to deploy on bare metal. With OpenStack’s 18th release, Rocky, both problems are much easier to deal with.
The open-source OpenStack cloud has, from its earliest releases, run on diverse hardware architectures: bare metal, virtual machines (VMs), graphics processing units (GPUs), and containers. But bare metal was always a bit tricky. OpenStack Ironic, the bare metal provisioning module, is bringing more sophisticated management and automation capabilities to bare metal infrastructure. Nova, which provisions compute instances, now supports creating both VMs and bare metal servers. This means it also supports multi-tenancy, so users can manage physical infrastructure the same way they manage VMs.
Other new Ironic features include:
- User-managed BIOS settings: BIOS (basic input output system) performs hardware initialization and has many configuration options that support a variety of use cases when customized. Options can help users gain performance, configure power management options, or enable technologies like single root input/output virtualization (SR-IOV) or Data Plane Development Kit (DPDK). Ironic also enables users to manage BIOS settings, supporting use cases like Network Functions Virtualization (NFV) and giving users more flexibility.
- Conductor groups: In Ironic, the “conductor” is the service that uses drivers to execute operations on the hardware. Ironic has introduced a “conductor_group” property, which can be used to restrict which nodes a particular conductor (or set of conductors) has control over. This allows users to isolate nodes based on physical location, reducing network hops for increased security and performance.
- RAM Disk deployment interface: A new interface in Ironic for diskless deployments. This is seen in large-scale and high performance computing (HPC) use cases when operators desire fully ephemeral instances for rapidly standing up a large-scale environment.
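The conductor-group mechanism described above is simple to picture: a conductor only manages nodes whose conductor_group matches its own. The sketch below illustrates that matching logic; the names and data structures are invented for the example and are not Ironic’s actual internals.

```python
# Illustrative sketch of conductor-group isolation (not Ironic's real code).
# A conductor only picks up nodes whose conductor_group matches its own,
# so nodes can be isolated by physical location (e.g. per rack or site).

from dataclasses import dataclass

@dataclass
class Node:
    name: str
    conductor_group: str  # e.g. a physical location like "rack-a"

@dataclass
class Conductor:
    name: str
    group: str

    def managed_nodes(self, nodes):
        # Restrict this conductor to nodes in its own group.
        return [n for n in nodes if n.conductor_group == self.group]

nodes = [
    Node("node-1", "rack-a"),
    Node("node-2", "rack-a"),
    Node("node-3", "rack-b"),
]

conductor = Conductor("cond-east", "rack-a")
print([n.name for n in conductor.managed_nodes(nodes)])  # ['node-1', 'node-2']
```

The node in "rack-b" is simply never seen by the "rack-a" conductor, which is the isolation property the feature provides.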
Julia Kreger, Red Hat principal software engineer and OpenStack Ironic project team lead, said in a statement, “OpenStack Ironic provides bare metal cloud services, bringing the automation and speed of provisioning normally associated with virtual machines to physical servers. This powerful foundation lets you run VMs and containers in one infrastructure platform, and that’s what operators are looking for.”
This isn’t just theory. It works. And it’s heading into production.
James Penick, Oath’s IaaS architect (Oath is AOL and Yahoo’s parent company), said Oath is already using OpenStack to manage “hundreds of thousands of bare metal compute resources in our data centers.” He added, “We have made significant changes to our supply chain process using OpenStack, fulfilling common bare metal quota requests within minutes.”
That’s good, but it’s not good enough.
“We’re looking forward to deploying the Rocky release to take advantage of its numerous enhancements such as BIOS management, which will further streamline how we maintain, manage and deploy our infrastructure,” Penick said.
That’s great, but many OpenStack users are already saying, “Maybe I’ll install this in 2021.”
Upgrading OpenStack isn’t easy. But OpenStack Rocky’s Fast Forward Upgrade (FFU) feature is ready for prime time, and it’s all set to help users overcome upgrade hurdles and get on newer releases of OpenStack faster. FFU lets an OpenStack on OpenStack (TripleO) user on release “N” speed through intermediary releases to land on release “N+3” (the current iteration of FFU covers the Newton release to Queens). You can’t jump all the way to Rocky, but you can get a lot closer to it, a lot more quickly, than you ever could before.
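The “N to N+3” idea is easy to illustrate: rather than stopping to stabilize on each intermediary release, FFU passes through them to land three releases ahead. A conceptual sketch, using the real OpenStack release sequence (the function itself is purely illustrative, not part of TripleO):

```python
# Conceptual sketch of a fast-forward upgrade (FFU): pass through the
# intermediary releases to land on N+3, instead of stopping at each one.
# The release names follow the actual OpenStack sequence.

RELEASES = ["newton", "ocata", "pike", "queens", "rocky"]

def fast_forward(current, target):
    """Return the releases passed through going from current to target."""
    i, j = RELEASES.index(current), RELEASES.index(target)
    if j <= i:
        raise ValueError("target must be a later release")
    return RELEASES[i + 1 : j + 1]

# Newton -> Queens is the N to N+3 jump FFU currently supports.
print(fast_forward("newton", "queens"))  # ['ocata', 'pike', 'queens']
```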
Other new features are:
- Cyborg provides lifecycle management for accelerators such as GPUs, FPGAs, DPDK, and SSDs. In Rocky, Cyborg introduces a new REST API for FPGAs (field-programmable gate arrays). These chips are used in machine learning, image recognition, and other HPC use cases. The new API enables users to dynamically change the functions loaded on an FPGA device.
- Qinling (pronounced “CHEEN-LEENG”), a function-as-a-service (FaaS) project, is introduced in Rocky. It delivers serverless capabilities on top of OpenStack clouds, enabling developers to run functions without managing servers, VMs, or containers, while still connecting to other OpenStack services like Keystone.
- Masakari, which supports high availability by providing automatic recovery from failures, expands its monitoring capabilities to include internal failures in an instance, such as a hung OS, data corruption, or a scheduling failure.
- Octavia, the load balancing project, adds support for UDP (user datagram protocol). This brings load balancing to edge and IoT use cases.
- Magnum, a project that makes container orchestration engines and their resources first-class resources in OpenStack, has become a Certified Kubernetes installer. This makes it easier to deploy Kubernetes on OpenStack.
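To make the FaaS idea concrete: a function deployed to a serverless platform like Qinling is, at its simplest, an ordinary function that the platform invokes on demand. The sketch below shows the general shape of such a function in Python; the entry-point name and signature are illustrative, so check the runtime’s documentation for the exact convention.

```python
# Sketch of a serverless-style function as it might be packaged for a
# FaaS runtime such as Qinling. The entry-point convention shown here is
# illustrative; consult the runtime's docs for the exact signature.

def main(name="world", **kwargs):
    # The platform, not the developer, handles servers, VMs, and containers;
    # the function just receives its inputs and returns a result.
    return {"greeting": f"Hello, {name}!"}

print(main(name="OpenStack"))  # {'greeting': 'Hello, OpenStack!'}
```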
Want to check the new OpenStack out? You can download Rocky today.
The cloud has come to the health care sector, and it’s having an impact by saving some money. However, that’s not the real value of cloud computing for this sector, a sector that affects us personally at some point in our lives.
Black Book Research found that 93 percent of hospital CIOs are actively acquiring the staff to configure, manage, and support a HIPAA-compliant cloud infrastructure. Also, 91 percent of CIOs in the Black Book survey report that cloud computing provides more agility and better patient care with the proliferation of health care data.
But there is a huge innovation gap in health care cloud computing between what’s possible and what is actually being done. Take patient data, for example. Most health care organizations, providers, and payers don’t make many moves toward better, more proactive management of patient data unless regulations push them along.
This isn’t about operational and billing data, or electronic health records (EHRs). If health care systems abstracted information in certain ways, both the doctor and the patient would have better insights into the patient’s health, preventive care, and treatment.
The cloud services that support these innovative functions are now dirt-cheap. As hospitals become cloud-enabled, it’s time to start moving faster toward the complete automation of care, treatments, and analyses of patient health. Let’s move from a system that’s largely reactive to a system that’s completely proactive.
Of course, there are islands of innovation in the health care sector. But it’s still mostly on the R&D side of things and has yet to trickle down to direct patient care. The potential here is greater than in any other sector I’ve seen. Just consider the telemetry information gathered from smart watches and cellphones, and the ability to funnel all that data through deep learning-enabled systems that cost pennies an hour to run on the cloud.
Now that we have the tools, there is little excuse not to innovate beyond what’s been done already. We’re better than this.
(Reuters) – Retail giant Walmart Inc said on Tuesday it entered into a strategic partnership with Microsoft Corp for wider use of cloud and artificial intelligence technology, in a sign of major rivals of Amazon.com Inc coming together.
The five-year agreement will leverage the full range of Microsoft’s cloud solutions, including Microsoft Azure and Microsoft 365, to make shopping faster and easier for customers, the Bentonville, Arkansas-based company said.
As part of the partnership, Walmart and Microsoft engineers will collaborate to migrate a significant portion of walmart.com and samsclub.com to Azure, Walmart added.
While Walmart is doubling down on its e-commerce presence to better compete with Amazon, Microsoft has been working on a technology that would eliminate cashiers and checkout lines from stores, Reuters reported last month.
Microsoft’s technology aims to help retailers keep pace with Amazon Go, the ecommerce giant’s highly automated store format.
The Windows software maker has also shown the sample technology to retailers from around the world and has had talks with Walmart about a potential collaboration, Reuters reported.
Through the partnership, Walmart plans to defend itself from Amazon’s retail ambitions and expertise in data, and boost its online presence.
Reporting by Rishika Chatterjee in Bengaluru; Editing by Gopakumar Warrier
- Netflix will invest $1.3B in technology this year alone according to a recent interview with founder and CEO Reed Hastings.
- Netflix is spending $8B on content production and licensing this year, with the goal of achieving 1,000 original releases in 2018.
- Netflix has been nominated for a record 112 Emmy Awards this year, breaking HBO’s 17-year streak at the top.
- For their latest fiscal quarter ending March 31 of this year, Netflix reported a revenue increase of 40% to $3.7B. International streaming increased 70% to $1.78B, and domestic streaming increased 24% to $1.82B.
- In their latest quarter, Netflix reported faster-than-forecast subscriber growth both internationally (5.46M net new subscribers versus guidance of 4.9M) and in the U.S. (1.96M net new subscribers, versus guidance of 1.45M).
- Netflix continues to expand its streaming base, ending their latest quarter with more than 118.9M global paid subscribers, up from 94.36M a year ago.
- By the end of March 2018, Netflix had reached 125M worldwide subscribers, 57M in the U.S. alone, in addition to having subscribers in 190 countries.
Netflix’s exponential growth this year is attributable in part to cloud platform decisions made years ago that enable its subscription-based business model to scale globally and securely. Last year, at Amazon’s AWS re:Invent 2017 conference, Greg Peters, chief product officer of Netflix, provided insights into how closely Netflix and AWS work together to create innovative new services based on AWS’ advances in machine learning-powered security, developer apps, and scalability. It’s an insightful session on how Netflix relies on Amazon to do the heavy lifting of infrastructure development, and it can be viewed here: AWS re:Invent 2017 – Fireside Chat: Steve Schmidt, Jenny Brinkley, and Greg Peters of Netflix.
AWS and Netflix development teams are using machine learning-powered security to analyze data access patterns and look for anomalous account activity. The discussion covers many concepts of Next-Gen Access (NGA) that are foundational to attaining Zero Trust Security (ZTS) across an enterprise’s IT infrastructure. AWS and Netflix are looking at how to capture the myriad data points generated daily by each access to their subscription service and assess the risk of a breach in real time, much like what Centrify is doing today. Greg Peters also defines scale as the ability to accommodate a growing, diverse base of developers with a paved-path network that enables them to create and innovate quickly. Netflix has a strong DevOps culture where engineers have the freedom to spin up a new AWS instance to try out new ideas in seconds, without having to wait for IT to approve them.
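The anomalous-account-activity detection described here can be illustrated with a toy statistical check: flag a day whose access count sits far outside an account’s own history. This is a deliberately simplified stand-in for the ML-driven analysis AWS and Netflix discuss, not anyone’s real system.

```python
# Toy anomaly check on access counts: flag a value more than three
# standard deviations from the account's historical mean. A simplified
# stand-in for the ML-powered analysis described above.

import statistics

def is_anomalous(history, todays_count, threshold=3.0):
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return todays_count != mean
    return abs(todays_count - mean) / stdev > threshold

history = [102, 98, 95, 101, 99, 97, 103]  # typical daily access counts
print(is_anomalous(history, 100))   # False: within the normal range
print(is_anomalous(history, 450))   # True: likely worth investigating
```

Real systems score far richer signals (geography, device, time of day) and weigh them probabilistically, but the shape of the problem is the same: model normal behavior, then surface the outliers.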
The following are ten charts that illustrate Netflix’s rapid growth as a cloud-based subscription business:
- 27% of Americans prefer Netflix over any other platform, including basic cable and broadcast TV, according to a recent survey by investment banking firm Cowen & Company. Netflix’s popularity is soaring with Americans in the 18 – 34 age group, with 39.7% naming Netflix as their favorite TV platform. The following illustrates just how dominant Netflix has become as the TV platform of choice. Scaling to this level of popularity is possible in part because of the decision to standardize on a single cloud platform and to get Netflix-specific features included on the AWS roadmap. Source: Netflix Is Americans’ Platform of Choice for TV Content, Statista, July 5, 2018.
- Netflix’s latest quarterly revenue of $3.7B is evenly distributed between domestic and international streaming, earned from 118.9M global subscribers as of March 31 of this year. Q1 2018 revenue is evenly split between domestic streaming (49%) and international streaming (49%). David Goldstein’s excellent graphic below provides a succinct analysis of the Netflix income statement for Q1 2018 and a profile of subscriber levels over time. Source: Netflix Strong Q1 for Revenues, Profits, and Members by David Goldstein, April 19, 2018. Mekko Graphics.
- Netflix dominates the U.S. video-on-demand (VoD) market with 77% of all VoD services subscribers. With a 21% lead on Amazon, Netflix has market momentum in the U.S. where the strategy of creating more original content is paying off with subscriber growth and a greater variety of content density than their many competitors. Source: Statista Global Consumer Survey, 2018
- 43% of all U.S. VoD users subscribe to both Netflix and Amazon Video. Subscribing to multiple services is common with U.S. VoD users with 83% subscribing to more than one service. Nearly 1 in 3 U.S. VoD subscribers (29%) are subscribing to five or more services. While so many subscription-based businesses struggle to gain customers and minimize churn, Netflix has devised an aggressive strategy of making their subscription, ad-free model succeed. Reed Hastings, CEO, credits the intensity of effort and focus they are putting on creating exceptional, high-value content that attracts new subscribers and makes them loyal. A video clip of a recent interview with him and other members of the senior management team is here. Source: Statista Global Consumer Survey, 2018
- Netflix is projected to have over 114M households subscribing online by 2020. Netflix is growing its global household subscriber base at an 8.96% compound annual growth rate (CAGR), increasing from 81.52M households in 2016 to over 114M in 2020. Localized Netflix-produced content is growing globally faster than senior management originally anticipated, with 3%, a Brazilian science-fiction series produced in Portuguese, being an example of one of the original content projects doing exceptionally well in 2018. Sources: Netflix Investor Relations and Digital TV Research.
- The Asia-Pacific subscriber base is projected to grow at an 18.47% CAGR through 2023, making it the fastest growing region globally. Western Europe is also forecast to gain subscribers, increasing by 16 million between 2018 and 2023. Latin America, where Netflix is enjoying success with originally produced content that is being well-received globally, is predicted to gain 8 million subscribers in five years. Sources: Netflix Investor Relations and Digital TV Research.
- By 2020, Netflix’s streaming business in the U.S. alone is projected to deliver over $7B in revenue. From 2018 to 2020, streaming revenues are projected to grow at a CAGR of 8%, jumping from $5.4B in 2017 to $7.2B in 2020. Between 2011 and 2014, Netflix more than doubled streaming revenues from U.S.-based subscribers, jumping from $1.6B to $3.4B. Sources: Digital TV Research, Netflix Investor Relations, and Nakono.
- While price is the most appealing feature for 56% of respondents to a recent TiVo/FierceCable survey, members having the flexibility to create their own profiles (52.9%) increases content consumption across all devices. Speaking from experience in a household where there are five separate Netflix accounts, each person having the opportunity to personalize their content preferences is a major advantage of Netflix over other streaming services. Search is the third favorite feature, and autoplay fourth, with 43.4% of respondents selecting it. Multiple responses were allowed to this question. Source: TiVo & FierceCable study completed December 2017.
- Netflix’s content strategy is paying off with strong loyalty across all age groups, including the 50 – 64-year-old segment, who often think of TV as the long-standing, ad-based broadcast networks. Netflix’s cloud strategy has made it possible to scale their original content across national and regional markets immediately, as is the case with the sci-fi series 3%, which is produced in Portuguese for the Brazilian market. Netflix’s senior management has found a strong reception for 3% across other nations as well. Their cloud platform makes it possible to scale this and other series globally in real time, outrunning competitors who have not invested as heavily in a scalable, secure cloud platform. Source: TiVo & FierceCable study completed December 2017.
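The household-growth claim in the charts above is easy to sanity-check. Using the endpoints given (81.52M households in 2016 to roughly 114M in 2020, four years), the implied CAGR works out to about 8.7 percent, in the same range as the cited 8.96 percent; the small gap presumably reflects more precise endpoint figures than the rounded ones quoted.

```python
# Sanity-checking the household-growth CAGR from the endpoints cited above:
# 81.52M households in 2016 growing to ~114M in 2020 (4 years).

def cagr(start, end, years):
    """Compound annual growth rate implied by two endpoint values."""
    return (end / start) ** (1 / years) - 1

rate = cagr(81.52, 114.0, 4)
print(round(rate, 3))  # 0.087, i.e. about an 8.7% CAGR
```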
Louis Columbus is an enterprise software strategist with expertise in analytics, cloud computing, CPQ, Customer Relationship Management (CRM), e-commerce and Enterprise Resource Planning (ERP).
Where do I get my cloud news? It’s almost never CPA Journal. But, more and more, accounting is becoming a larger part of cloud computing—no matter what side of the cloud you’re sitting on.
On the enterprise side, it’s a matter of taxes to be paid. While you can typically find 30 to 40 percent better operational cost utilization when using cloud computing, that savings may be diluted by the fact that you’re giving up depreciation on hardware in the datacenter.
So, while cloud computing can save you millions of dollars a year, it may actually cost you money, at least in the short term. That’s something that I’ve run into from time to time with clients over the years.
At issue is that you need to consider net savings. That means looking at the all-in cost of the cloud, including tax and other accounting implications.
Although cloud computing is typically a superior model, walking away from traditional hardware and software has a cost as well. Indeed, in a few cases I’ve found that a cloud computing solution that would save $10 million a year actually would cost $15 million once the impact of taxes is considered. The gross savings made sense for cloud, but the net savings did not.
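A back-of-the-envelope version of that gross-versus-net comparison looks like the sketch below. All the figures are hypothetical, chosen only to mirror the shape of the example: gross savings can be wiped out once lost depreciation and other transition costs are counted.

```python
# Hypothetical gross-vs-net cloud savings check. All numbers are made up
# to illustrate the point above: gross savings can turn into a net loss
# once lost depreciation deductions and transition costs are counted.

def net_savings(gross_savings, lost_depreciation, tax_rate, other_costs=0.0):
    # Giving up depreciation raises taxable income, so its cost is
    # roughly depreciation * tax_rate.
    tax_impact = lost_depreciation * tax_rate
    return gross_savings - tax_impact - other_costs

# A move that saves $10M gross but gives up $40M of depreciation deductions
# at a 25% tax rate, plus $5M in migration costs:
result = net_savings(10_000_000, 40_000_000, 0.25, other_costs=5_000_000)
print(result)  # -5000000.0: the $10M gross saving becomes a $5M net loss
```

A real ROI model layers on jurisdiction-specific tax rules, which is exactly why CPAs end up building them.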
So, how are cloud geeks supposed to deal with these accounting issues? By using business analysts to work up cloud ROI models. It’s not uncommon for these business analysts to be CPAs.
Even more complex is the fact that most companies are multinational these days, so you have to figure out the net cost impact not just for a single country, but for dozens of countries that have some pretty odd laws when it comes to accounting, especially tax issues. At this point, the ROI models become pretty complex.
But you don’t have to cede everything to the CPAs and lawyers. The good news is that current IT cost-governance tools for cloud computing do indeed consider other net cost issues. So you’ll actually see a truer cost of using a cloud service, versus just the cost of the cloud services—and for the operational life, not just the upfront ROI analysis.
Although it adds complexity to the cloud migration path, accounting is just a fact of life in business.
Who knows? Perhaps one day we’ll see this as a specialty in accounting. (Umm, I hope not!)
Do you have dysfunctional enterprise data? The symptoms are pretty easy to spot: no single source of truth for customers, orders, inventory, and so on, or not being able to properly secure and govern the data, and thus being unable to deal with regulations.
Many enterprises are taking their dysfunctional data to the cloud, bringing its limits and problems along with it. But you don’t have to perpetuate that dysfunction in the cloud. Here are some ways to fix dysfunctional data when moving to the cloud.
Option 1: Fix the data in flight to the cloud
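Fixing data in flight can be as simple as a normalization-and-deduplication pass wired into the migration pipeline, so the target cloud system starts with a single source of truth. A minimal sketch follows; the field names and cleanup rules are invented for illustration.

```python
# Illustrative in-flight cleanup for a cloud migration: normalize records
# and collapse duplicates so the target system starts with one source of
# truth. Field names and rules here are invented for the example.

def normalize(record):
    return {
        "email": record["email"].strip().lower(),
        "name": record["name"].strip().title(),
    }

def dedupe(records, key="email"):
    seen = {}
    for rec in map(normalize, records):
        seen.setdefault(rec[key], rec)  # keep the first occurrence per key
    return list(seen.values())

raw = [
    {"email": "  ADA@Example.com ", "name": "ada lovelace"},
    {"email": "ada@example.com", "name": "Ada Lovelace"},
    {"email": "grace@example.com", "name": " grace hopper"},
]
print(dedupe(raw))  # two records remain, one per unique email
```

In practice this step would run inside whatever ETL or migration tooling you use, but the principle is the same: clean while you move, rather than copying the dysfunction as-is.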
The mood around tech is dark these days. Social networks are a cesspool of harassment and lies. On-demand firms are producing a bleak economy of gig labor. AI learns to be racist. Is there anyplace where the tech news is radiant with old-fashioned optimism? Where good cheer abounds?
Why, yes, there is: clean energy. It is, in effect, the new Silicon Valley—filled with giddy, breathtaking ingenuity and flat-out good news.
This might seem surprising given the climate-change denialism in Washington. But consider, first, residential solar energy. The price of panels has plummeted in the past decade and is projected to drop another 30 percent by 2022. Why? Clever engineering breakthroughs, like the use of diamond wire to slice silicon wafers into ever-skinnier slabs, producing higher yields with less raw material.
Manufacturing costs are down. According to US government projections, the fastest-growing occupation of the next 10 years will be solar voltaic installer. And you know who switched to solar power last year, because it was so cheap? The Kentucky Coal Museum.
Tech may have served up Nazis in social media streams, but, hey, it’s also creating microgrids—a locavore equivalent for the solar set. One of these efforts is Brooklyn-based LO3 Energy, a company that makes a paperback-sized device and software that lets owners of solar-equipped homes sell energy to their neighbors—verifying the transactions using the blockchain, to boot. LO3 is testing its system in 60 homes on its Brooklyn grid and hundreds more in other areas.
“Buy energy and you’re buying from your community,” LO3 founder Lawrence Orsini tells me. His chipsets can also connect to smart appliances, so you could save money by letting his system cycle down your devices when the network is low on power. The company uses internet logic—smart devices that talk to each other over a dumb network—to optimize power consumption on the fly, making local clean energy ever more viable.
But wait, doesn’t blockchain number-crunching use so much electricity it generates wasteful heat? It does. So Orsini invented DareHenry, a rack crammed with six GPUs; while it processes math, phase-changing goo absorbs the outbound heat and uses it to warm a house. Blockchain cogeneration, people! DareHenry is 4 feet of gorgeous, Victorianesque steampunk aluminum—so lovely you’d want one to show off to guests.
Solar and blockchain are only the tip of clean tech. Within a few years, we’ll likely see the first home fuel-cell systems, which convert natural gas to electricity. Such systems are “about 80 percent efficient,” marvels Garry Golden, a futurist who has studied clean energy. (He’s also on LO3’s grid, with the rest of his block.)
The point is, clean energy has a utopian spirit that reminds me of the early days of personal computers. The pioneers of the 1970s were crazy hackers, hell-bent on making machines cheap enough for the masses. Everyone thought they were nuts, or small potatoes—yet they revolutionized communication. When I look at Orsini’s blockchain-based energy-trading routers, I see the Altair. And there are oodles more inventors like him.
Mind you, early Silicon Valley had something crucial that clean energy now does not: massive federal government support. The military bought tons of microchips, helping to scale up computing. Trump’s band of climate deniers aren’t likely to be buyers of first resort for clean energy, but states can do a lot. California already has, for instance, by creating quotas for renewables. So even if you can’t afford this stuff yourself, you should pressure state and local officials to ramp up their solar energy use. It’ll give us all a boost of much-needed cheer.
Write to firstname.lastname@example.org.
This article appears in the January issue. Subscribe now.
Established in 2009, Azzaron provides cloud computing solutions including Desktop as a Service (DaaS), Infrastructure as a Service (IaaS) and a …