CoreStation for AI
11 November 2024
An end to waiting for the right datacentre or cloud resources to run your projects! How high-performance users can pool resources from their own desktops.
Before we get to the CoreStation for AI magic, let’s take a step back and set out how we’ve physically built it. We start with the CoreStation CX6620, a proven remote workstation with four nodes in a 2U chassis, and add an HBA (Host Bus Adapter) inside each node to create a PCIe switched fabric. Using Liqid CDI software, that fabric allows resources from different vendors (GPUs, FPGAs, DPUs, storage, networking and more) to be pooled and composed, en masse, to any individual node.
It gives the individual user the capability to compose the required resources and run the job from their own desktop, rather than having to submit the job and wait for on-prem or cloud resources to be approved and allocated.
IS THIS THE DAWN OF DEMOCRATISING AI?
Accelerator-centric datacentre resources, whether on-prem or in the cloud, combine specialist hardware and software, enabling power users to process massive amounts of data at high speed. They are essential for a range of AI workloads, including LLM training, RAG (Retrieval Augmented Generation), inferencing and AI-accelerated workloads such as CFD (Computational Fluid Dynamics) simulations. But bringing these datacentre resources together takes time, budget and, inevitably, buy-in from across the business. What if we told you there is a new solution that gives individual users the capability to pool resources from their own desktop? Introducing CoreStation for AI.
An innovative approach to AI tasks
We’ve already written about the need to deploy AI. Now let us look at the solution in more detail. The continued advancement of compute technology and flexible software solutions is changing the game for power users. Using our experience of building high-compute solutions and leveraging our strategic partners, we have developed CoreStation for AI, a purpose-built alternative to waiting for on-prem or cloud resources to be allocated to your project: it brings the power to run modern, complex workloads to your desktop.
Starting with the CoreStation CX6620, a proven remote workstation with four nodes in a 2U chassis, we’ve added an HBA (Host Bus Adapter) inside each node to create a PCIe switched fabric. Using Liqid CDI software, this enables power users to pool resources and compose them, en masse, to any individual node as they need. That means GPUs, FPGAs, DPUs, storage, networking and other resources, all from different vendors, can be made available at the desktop. Let us look at the individual elements:
CoreStation CX6620
Part of our CoreStation range of remote workstations, the CX6620 delivers the power and performance you need for demanding workloads. It integrates all the necessary components – processing power, graphics, storage, and networking – into a single, compact chassis, minimising the need for diverse hardware platforms and simplifying IT management.
Leveraging Dell PowerEdge C-class hardware, the CX6620 includes four independent nodes in a 2U chassis with shared power and cooling. This enables the deployment of a dedicated workstation in just 0.5U of rack space, or up to eighty workstations (twenty 2U chassis) in a modern datacentre rack.
Liqid CDI software
Each node in the CoreStation CX6620 already includes a dedicated GPU, but in building CoreStation for AI we have added an HBA to each of the four nodes. This creates a PCIe switched fabric (connecting devices together to support multiple data transfers simultaneously), which allows us to attach any PCIe resource to any of the nodes, removing the usual hardware constraints. CoreStation for AI with CDI makes the Impossible Workstation a reality.
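To the node, a composed device looks like locally installed hardware. The short sketch below is our own illustration, not part of the Liqid tooling: it simply shows how a user on a Linux node could list accelerator-class PCIe devices with the standard lspci utility to confirm that newly composed devices have appeared over the fabric.

```python
# Illustrative sketch only (not Liqid software): list accelerator-class PCIe devices
# on a Linux node. Devices composed over the fabric appear here like local hardware.
import subprocess


def list_accelerators() -> list[str]:
    """Return lspci lines for common accelerator device classes on this node."""
    out = subprocess.run(["lspci", "-nn"], capture_output=True, text=True, check=True)
    classes = ("3D controller", "VGA compatible controller", "Processing accelerators")
    return [line for line in out.stdout.splitlines() if any(c in line for c in classes)]


if __name__ == "__main__":
    for device in list_accelerators():
        print(device)
```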
Using Liqid CDI (Composable Disaggregated Infrastructure) software, users can pick from a pool of additional, unallocated resources to build the optimum underlying hardware platform for each individual task. Essentially, the bare-metal server has access to all hardware components, as required, without physically building a cluster. By dynamically allocating GPUs and other resources to individual workstations, CoreStation for AI opens new possibilities at the desktop: greater flexibility, greater scalability, and all secure by design.
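As a simple illustration of what this means for the end user (a hedged sketch, assuming PyTorch with CUDA support is installed on the node), a user could confirm from their own desktop session that the GPUs currently composed to their workstation are visible and ready for a job:

```python
# Illustrative sketch (assumes PyTorch with CUDA support on the node): report the
# GPUs currently visible to this workstation, e.g. after extra devices are composed.
import torch


def report_composed_gpus() -> None:
    if not torch.cuda.is_available():
        print("No CUDA-capable GPUs visible on this node.")
        return
    count = torch.cuda.device_count()
    print(f"{count} GPU(s) currently visible to this node:")
    for i in range(count):
        props = torch.cuda.get_device_properties(i)
        print(f"  [{i}] {props.name}, {props.total_memory / 1024**3:.1f} GiB")


if __name__ == "__main__":
    report_composed_gpus()
```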
Want to know more?
Discover more about CoreStation for AI on our dedicated website for the CoreStation range of products: AI | CoreStation. Alternatively, you can contact our global offices here.