Finding the Sweet Spot When It Comes to Your Server Refresh Cycle

Nothing lasts forever. Despite the rumors, even Twinkies have a limited shelf life. Which is why the server refresh cycle is so important for organizations today. Servers don't last forever, and waiting too long to replace them can result in downtime and put your core business functions at risk. On the flip side, if you refresh too soon and for the wrong reasons, it can be a costly decision that eats up most of your IT budget.

So How Do You Find That Server Refresh "Sweet Spot"?

When it comes to a server refresh, there are plenty of factors to consider. Cost, frequently run applications, IT staff, current infrastructure, growth objectives, and your plans for emerging workloads all come into play. Unfortunately, there is no magical, one-size-fits-all answer: the best time to refresh your servers depends on your organization's unique needs and long-term goals. There are obvious costs associated with modernizing your on-premises infrastructure. But there are also substantial costs to NOT doing it. By continuing to run legacy hardware, you could be putting your organization at risk.

In the past, the average server refresh cycle was about 5 years. But that timeline has shifted. Today, it's not uncommon for businesses to refresh on a 3-year cycle to keep up with modern technology. These companies aren't just refreshing for the fun of it (although we agree that new servers and data center toys ARE exciting) – they're doing so to meet increasing demands and strategically position themselves to handle the innovations of the future. They know they need to modernize to remain competitive and prepare for new technologies.

Benefits of a Server Refresh

Modern servers are made specifically to handle emerging workloads. For example, the PowerEdge MX7000 features Dell EMC kinetic infrastructure, which means that shared pools of disaggregated compute, storage, and fabric resources can be configured – and then reconfigured – to meet specific workload needs and requirements.

In addition to handling data-intense workloads, replacing servers and other critical hardware reduces downtime and greatly reduces the risk of server failure. Improved reliability means that your IT staff spends less time on routine maintenance, freeing them up to focus on things that add value to the business.

Additionally, newer servers provide greater flexibility and give you the opportunity to scale as needed based on changing demands. Some workloads, especially mission-critical applications, are best run on-premises, and a modernized infrastructure makes it easier to adapt and deploy new applications. A recent study by Forrester found that Modernized firms are more than twice as likely as Aging firms to cite faster application updates and improved infrastructure scalability.[1]

Modernized servers also enable you to virtualize. By layering software capabilities over hardware, you can create a data center where all the hardware is virtualized and controlled through software. This helps improve traditional server utilization, which is typically less than 15% of capacity without virtualization. The short sketch below shows how that utilization gap changes the refresh math.
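As a rough, back-of-the-envelope illustration of that utilization gap (every number below is a hypothetical assumption for the sake of the example, not a benchmark or a claim about any particular environment), the consolidation arithmetic looks something like this:

```python
# Hypothetical consolidation estimate: how many modern, virtualized hosts
# could absorb the work of a fleet of lightly utilized legacy servers.
# All numbers are illustrative assumptions, not benchmarks.
import math

legacy_servers = 40          # aging physical servers (assumption)
legacy_utilization = 0.15    # ~15% average utilization, the figure cited above
target_utilization = 0.60    # conservative target for virtualized hosts (assumption)
perf_gain_per_server = 2.0   # assume each new server delivers ~2x the compute of a legacy box

# Total useful work currently being done, measured in legacy-server equivalents
useful_work = legacy_servers * legacy_utilization

# Work each new, virtualized host can absorb at the target utilization
capacity_per_new_host = perf_gain_per_server * target_utilization

new_hosts_needed = math.ceil(useful_work / capacity_per_new_host)

print(f"Estimated modern hosts needed: {new_hosts_needed} "
      f"(down from {legacy_servers} legacy servers)")
```

Real consolidation ratios depend heavily on the workloads involved, but the arithmetic shows why low utilization on legacy hardware is itself a cost.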
A server refresh presents a tremendous opportunity to improve your IT capabilities. New servers help you remain competitive and position you for future data growth, innovative technologies, and demanding workloads that require systems integration.

For more information about the benefits of a server refresh, download the Forrester study Why Faster Refresh Cycles and Modern Infrastructure Management Are Critical to Business Success or contact a Dell EMC representative today. To learn more about PowerEdge, visit dellemc.com/servers, or join the conversation on Twitter @DellEMCservers.

[1] A commissioned study conducted by Forrester Consulting on behalf of Dell EMC, "Why Faster Refresh Cycles and Modern Infrastructure Management Are Critical to Business Success," May 2019.


Unlock Real-Time, GPU-Driven Insights With Azure Stack Hub

As you may have seen, Microsoft announced the public preview of GPU capabilities with Azure Stack Hub. What does this mean? GPU support in Azure Stack Hub unlocks a variety of new solution opportunities. For customers running training and inference workloads on Azure, or looking to run applications on Azure N-Series virtual machines, this preview brings those capabilities to Azure Stack Hub. Visualization is another targeted use case, where customers are looking to leverage GPU capabilities to render large amounts of data on specific targets closer to where the data is generated.

To address these scenarios, Dell Technologies, in collaboration with Microsoft, is excited to announce upcoming enhancements to our Dell EMC Integrated System for Microsoft Azure Stack Hub portfolio. Using GPU-accelerated AI and ML capabilities, these enhancements will unlock valuable, actionable information derived from large on-premises data sets at the intelligent edge, without sacrificing security.

Our GPU configurations are based on the Dell EMC Integrated System for Microsoft Azure Stack Hub dense configuration platform powered by PowerEdge R840 rack servers and will include both NVIDIA V100 and AMD MI25 GPUs in a 2U form factor. This will provide customers with increased performance density and workload flexibility for the growing predictive analytics and AI/ML markets. Our joint customers will be able to choose the appropriate GPU for their workloads to enable AI training, inference and visualization scenarios.

Following our stringent engineered approach, Dell Technologies goes far beyond treating GPUs as just additional hardware components in the Dell EMC Integrated System for Microsoft Azure Stack Hub portfolio. These new configurations, like all Dell EMC Integrated System for Azure Stack Hub offerings, also come with automated lifecycle management capabilities, streamlined operations, and exceptional support.

Dell Technologies has a long history of co-engineering with Microsoft, and these new enhancements further strengthen our joint portfolio across hyperconverged infrastructure and hybrid cloud solutions. By working together to deliver innovative services faster and more frequently, we can become a real partner of change for our customers in this Digital Transformation era.

With these new GPU-based configurations at the preview stage, we look forward to working closely with our customers, in partnership with Microsoft, to understand their scenarios and develop the right GPU platform to ensure a successful outcome. If you are interested in sharing your feedback, please contact us to speak with one of our engineering technologists.
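To make the workload side of this concrete, here is a minimal, generic sketch of the kind of GPU-accelerated inference loop that N-Series-style instances are built for. It is illustrative only: the model, tensor shapes and batch size are hypothetical, it assumes PyTorch with CUDA support is installed, and it is not tied to any specific Azure Stack Hub configuration.

```python
# A minimal, generic sketch of a GPU-accelerated inference loop (PyTorch).
# Illustrative only: model, shapes and batch size are hypothetical placeholders.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # falls back to CPU if no GPU/driver

model = nn.Sequential(              # stand-in for a real trained model
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).to(device).eval()

batch = torch.randn(64, 512, device=device)  # synthetic input batch

with torch.no_grad():               # inference only: no gradients needed
    scores = model(batch)
    predictions = scores.argmax(dim=1)

print(f"Ran inference on {device}; predicted classes for {predictions.shape[0]} samples")
```

The same code falls back to the CPU when no GPU is present, which is a convenient way to compare throughput before and after a GPU-backed deployment.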


Intelligent and Elastic Compute Will Drive Future Edge Innovations

Co-Author: Liam Quinn, SVP / Senior Fellow, Client Solutions Group, Dell Technologies

Imagine a framework where devices can share computing resources and services seamlessly with other devices and leverage programmable network capabilities to optimally analyze data and deliver business outcomes. IoT, social media and new edge deployments enabled by 5G and AI/ML (artificial intelligence / machine learning) are driving cloud applications to move to the distributed edge. Future usage models will require new architectures that enable a highly collaborative data-processing environment with intelligent, fluid sharing of compute resources and mobility of applications across devices. We believe new architectures and rapid industry innovation are required to enable seamless collaboration and data sharing at the edge.

Current Edge Use Cases and Technology Innovations

Current edge use cases are moving data processing closer to end devices. For example, content distribution moves data from cloud to edge and leverages content delivery networks (CDNs) to cache content closer to end users. IoT and social media use cases both consume and generate data at the edge. This data is analyzed on the end devices or in an edge-cloud to deliver the right outcome. The edge-cloud may be on-prem, in a co-location facility or offered "as-a-Service" by a service provider. The use cases and deployment scenarios vary across industry verticals, including home automation, ADAS, smart factories, retail, oil and gas, agriculture and enterprises.

Growth in the number of IoT devices, the cost of data transport and mission-critical use cases drove this shift of data processing from cloud to edge. The enabling technologies are summarized below.

High-Performance Client Devices: Client devices have CPU cores and hardware acceleration capabilities to process data locally and execute ML inferencing models. New embedded sensors and sensor fusion are driving continuous improvement in intelligence, location and context-aware capabilities. Combined with AI/ML, this is creating client/edge-ready and agile applications.

Network Edge-Cloud: Cloud applications and data processing are moving to the network edge and edge-clouds to process data closer to end devices at lower latency.
Improvements in compute performance per watt are enabling increased processing capabilities. Data management control planes play an important role in deciding where optimized data processing occurs.

AI/ML and Accelerators: AI/ML frameworks and low-power hardware accelerators are emerging to enable inferencing at the edge, while the compute-intensive work of training is performed in the centralized cloud. These accelerators are embedded in smart end devices and network edge platforms. The trained ML model is delivered to the edge to enable inferencing close to the point of data generation (a short sketch of this pattern follows after this section).

5G Cellular Network: The emerging 5G network enables high-speed wireless network pipes and lower latency for diverse workloads, leveraging existing and new spectrum. Virtualization of the RAN (vRAN) shifts radio processing from custom devices at cell towers to standard x86 servers with hardware accelerators for aggregated RAN processing. It enables performance scaling for a large number of micro-cells, seamless mobility and billions of end devices.

Network Slicing: The growth in the number of edge applications and the distribution of processing between cloud and edge are driving the next level of innovation in network-slicing capabilities. This enables connectivity with a guaranteed SLA between an end device and a backend application, whether that application runs at the telco network edge or in the cloud.

Programmable Networks: Programmable networks are emerging to enable seamless mobility of users and reconfiguration of network slices. They also enable moving an application seamlessly from cloud to edge, along with the associated networking and security services.

SD-WAN: Software-defined WAN has emerged to enable edge-to-cloud connections with multiple quality-of-service options. Data is distributed across these WAN connections to deliver optimal application performance.

Wi-Fi and Cellular Convergence: Requirements around seamless end-user mobility are driving innovation in Wi-Fi and cellular convergence with next-generation Wi-Fi and private LTE / CBRS (Citizens Broadband Radio Service). New Dell edge devices with embedded sensors and smart antenna/radio switching are key elements in enabling this seamless experience based on workload applications and usage models.

Dell Technologies is innovating in each of the above areas and collaborating with telcos, service providers and cloud providers to deliver critical capabilities to end users. The edge and data center infrastructures are designed to scale with smart client devices, high-performance PowerEdge servers, hyperconverged infrastructure (HCI), and dense GPU platforms.

Dell Technologies recently announced innovations that highlight these capabilities:

PowerEdge XE2420 servers offer dense compute and robust security for edge deployments.
Modular Data Center Micro 415 brings the data center to far-reaching and rugged environments.
Dell EMC iDRAC9 software brings remote access for a uniform, more secure server management experience from the edge to the core to the cloud.
Dell EMC Streaming Data Platform stores and analyzes edge data.
Network slicing capabilities enable software on client devices to configure network slices all the way from user applications to the network edge or cloud for guaranteed performance.
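The "AI/ML and Accelerators" item above describes training models centrally and inferencing at the edge. Here is a minimal, generic sketch of that pattern; the model file, input shape and runtime choice are hypothetical placeholders, and it assumes the onnxruntime and numpy packages are installed on the edge device.

```python
# Minimal sketch: run a centrally trained model locally on an edge device.
# The model path and input shape are hypothetical; in practice the ONNX file
# would be exported in the cloud and shipped to the device.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("exported_model.onnx")  # model delivered from the central cloud
input_name = session.get_inputs()[0].name

# Stand-in for a frame captured by a local camera or sensor at the edge
frame = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Inference happens locally, so the hot path avoids a round trip to the cloud
scores = session.run(None, {input_name: frame})[0]
predicted_class = int(np.argmax(scores))

print(f"Local inference complete; predicted class index: {predicted_class}")
```

Only compact results, or periodic telemetry used for retraining, need to travel back over the WAN, which is the latency and transport-cost argument the post makes for processing data at the edge.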
Future Edge Usage Models and Architectural Shifts

With the dramatic growth in data as well as edge devices, the current edge infrastructure doesn't scale well for the extreme collaboration environments of the future. Next-generation usage models around real-time content sharing, gaming, AR/VR, autonomous vehicles, drones and robotics are driving highly collaborative environments where information will be stored, distributed and analyzed across the end devices and the network edge. For example, users watching a game or concert in a stadium want to share locally captured content in real time with other users. Distributed consumers want to collaborate using online gaming combined with virtual reality (VR) capabilities. This will leverage peer-to-peer communication and embedded AR/VR capabilities in the client devices. Autonomous vehicles, drones and robotics will take edge information processing and direct device-to-device communication to the next level.

The figures below show this evolution from today's Edge-to-Cloud architecture to the future Intelligent & Elastic Compute architecture.

[Figure: Current technology enablers – hardware acceleration; 4G → 5G, programmable networks; SD-WAN; Wi-Fi and cellular convergence (5G, CBRS). Future technology enablers – AI-enabled elastic compute; devices with system-on-chip modules, micro-AI; multi-tenant network slicing; decentralized/trusted compute and data fabric; intelligent arbitration across peer-to-peer devices.]

The future architecture will build on the capabilities of current high-performance client devices, AI/ML and the network edge, but it will also require a new range of innovations to deliver highly intelligent devices, an elastic compute environment and a decentralized storage infrastructure that adapts to the demands of next-generation workloads. Various industry efforts are underway to drive these innovations, and Dell Technologies is developing solutions for AI-driven elastic compute.

Intelligent Client Devices: Highly integrated system-on-chip modules are emerging that will serve as the building blocks in client and edge devices, including wearables, video surveillance, industrial and automotive systems. These ASICs enable highly intelligent devices with the right amount of compute, memory, storage and AI/ML capabilities. System-on-chip modules are integrating micro-AI and micro-accelerators with software frameworks to ease AI application development. These silicon and power advancements and software frameworks will enable an environment where device capabilities are treated as virtual. Think of an "elastic model" where devices and the network collaborate to deliver the experience requested by an end user, leveraging collective capabilities across peer devices and the network edge.

Decentralized Data Fabric: Storage architecture will become decentralized, enabling end users and edge clouds to contribute storage capacity. Data will be distributed across locations based on geo-awareness, performance, security and regulatory policies (e.g. GDPR).

Multi-tenant and Trusted Compute: Trusted computing frameworks are emerging to enable optimal placement of application execution and associated data. Processor vendors are embedding security features (e.g. Intel SGX, AMD SEV, Arm TrustZone) to create a trusted execution environment for applications in multi-tenant environments.

Data Center Disaggregation and Memory Fabrics: Enterprise infrastructure is evolving to a disaggregated, composable architecture in which CPU, memory, IO and accelerators are disaggregated using a memory-based fabric. This enables a software-defined infrastructure in which compute nodes are dynamically composed to adapt to workload needs. Technology innovations in this area include Gen-Z and CXL for future memory fabrics, persistent memory (PMEM) for high-speed storage and high-bandwidth memory (HBM) for future embedded memory.

vRAN and Dynamic Network Programmability: vRAN (virtual RAN) and programmable networks will evolve to enable guaranteed SLAs on peer-to-peer connections across devices and applications.

AI/ML-driven Resource Optimization: Streaming telemetry from client and enterprise infrastructure will feed distributed analytics for real-time reconfiguration of infrastructure and applications.

These technology innovations will enable a grid-like elastic compute environment. The opportunity is to make intelligent use of distributed resources, network transports and mobile connectivity to enable a collaborative environment in which device capabilities and applications are treated as virtual entities and peer-to-peer architectures enable elastic composition of services. AI/ML will serve as the centralized brain that orchestrates applications across secured and distributed resources.
Companies that lead this innovation will succeed in the next generation of the AI-driven, elastic and decentralized edge. Dell Technologies is driving these innovations across its solution offerings and industry standardization efforts, and is engaging with partners to both innovate and integrate technologies in Dell platforms.


Giving Computers a Voice

How HPC enables natural language processing – and what you can do with it

Natural language processing (NLP) is a form of artificial intelligence (AI) that enables a computer to understand, interpret, and use spoken or written human language. For example, NLP can be used to translate communications from one language to another, convert voice to text and vice versa, and even give chatbots the ability to have human-like conversations that help people get quick responses to questions and concerns.

The field of NLP has recently been transformed by the use of neural networks and deep learning, enabled by advances in High Performance Computing (HPC). It's now possible to build AIs that can interact with people more naturally than ever before. And first movers are already incorporating NLP into a wide range of processes.

To help you capitalize on this trend, Dell Technologies has an active research program focused on helping interested parties explore, develop and adopt natural language processing applications. This research is carried out by a data sciences team at the Dell Technologies HPC & AI Innovation Lab in Austin, Texas.

For example, data scientists at the lab are working to solve key problems associated with translating from one human language to another (a minimal, generic sketch of machine translation appears at the end of this post). Our research indicates that training models for language-to-language translation can be scaled to an extreme level, allowing models to be trained at a much faster pace and at a much larger scale while preserving state-of-the-art results.

In another example, the team undertook to convert text to a human-sounding voice. Our neural networks have been able to transform voice synthesis, replacing artificial-sounding "robot speak" with smooth, natural voices. The scale-out parallelism and acceleration made possible by HPC systems at the lab are driving down the time to create these voice models from months to hours, turning the dream of conversational computers into a reality even faster.

Natural language processing is a powerful tool for organizations to streamline their interactions with customers, employees, partners and others. To help you capitalize on this opportunity, researchers in the Dell Technologies HPC & AI Innovation Lab are working to advance the technologies and methodologies for the development of language-to-language translation and text-to-voice applications. We're excited to share our learnings, insights and best practices for NLP using high performance computing and artificial intelligence.

To learn more:
Read the Proving the art of the possible with natural language processing white paper.
Visit com/AI.
Watch session replays and highlights at Dell Technologies World and attend Luke Wilson's DTW On-Demand breakout session, HPC Gives Computers a Voice, to learn how to have a conversation with your computer.
Search for other DTW sessions in the online catalog by entering "analytics," "artificial intelligence" or "HPC" in the right-side search field.
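To make the translation example tangible, here is a minimal, generic machine-translation sketch using the open-source Hugging Face Transformers library. It is purely illustrative (it is not the Innovation Lab's model or code), it assumes the transformers and torch Python packages are installed, and the choice of t5-small is arbitrary.

```python
# Minimal, generic machine-translation sketch with an off-the-shelf model.
# Illustrative only: this is not the Dell Technologies HPC & AI Innovation Lab's
# model, and it assumes the `transformers` and `torch` packages are installed.
from transformers import pipeline

# t5-small is a small pretrained model that supports English -> German translation
translator = pipeline("translation_en_to_de", model="t5-small")

result = translator("Natural language processing gives computers a voice.")
print(result[0]["translation_text"])  # prints a German rendering of the sentence
```

Training and serving models like this at production quality is where the scale-out parallelism described above comes in; the sketch only shows what the finished capability looks like from the application side.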


US boosting vaccine deliveries amid complaints of shortages

President Joe Biden says the U.S. is ramping up vaccine deliveries to hard-pressed states over the next three weeks and expects to provide enough doses to vaccinate 300 million Americans by the end of the summer or early fall. Biden is calling the push a “wartime effort.” He said Tuesday that his administration is working to buy an additional 100 million doses of each of the two approved coronavirus vaccines. And he acknowledged that states in recent weeks have been left guessing how much vaccine they will have from one week to the next. He called that “unacceptable” and said “lives are at stake.”


North Carolina stops issuing Confederate license plates

RALEIGH, N.C. (AP) — The North Carolina Division of Motor Vehicles says it will no longer issue specialty license plates featuring the Confederate battle flag. The StarNews of Wilmington reports the agency says removal of the license plate, issued to members of the Sons of Confederate Veterans organization, took effect Jan. 1. A statement from NCDMV says it will continue to recognize the North Carolina Division of Sons of Confederate Veterans as a civic organization entitled to a specialty plate, but the recognition doesn’t entitle it to dictate the contents of the government speech on that plate.


Czech PM in Hungary to discuss Russian, Chinese vaccines

BUDAPEST, Hungary (AP) — The Czech Republic could be on its way to becoming the next European Union member to seek a COVID-19 vaccine from outside the EU’s common procurement program. Czech Prime Minister Andrej Babis traveled to Hungary on Friday to consult with Hungarian authorities on their experiences with vaccines from Russia and China. The Czech leader, who has criticized the speed of vaccine delivery in Europe, said vaccines should not be a political question and their countries of origin should not determine whether they are used. Hungary has purchased vaccines from both Russia and China, the only EU country to do so. Next Wednesday, Babis plans to visit Serbia, which has also begun administering Russian and Chinese vaccines.
