Pod and core infrastructure allows for flexible data center design
By Max Burkhalter | August 27, 2014
Benefits for performance
The ability to scale data centers to meet user demand is an important priority for a company's IT department. Pod and core infrastructures simplify this process by allowing IT teams to add new pods or integrate hardware to boost the performance of existing pods. The use of a central network also helps IT teams avoid vendor lock-in. TechRepublic explains that pod hardware can be reprogrammed to better meet the performance needs of the company as long as it remains compatible with the central network.
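To make the idea concrete, the sketch below models a core network with attached pods; the class and method names (CoreNetwork, Pod, add_pod) and the capacity figures are illustrative assumptions, not any vendor's API. Scaling out amounts to registering another compatible pod with the core.

```python
# Minimal sketch of a pod-and-core layout. All names and numbers here
# are illustrative assumptions, not a real vendor API.

class Pod:
    def __init__(self, name, capacity_units, protocol="ethernet"):
        self.name = name
        self.capacity_units = capacity_units  # abstract measure of compute/storage
        self.protocol = protocol              # must match what the core speaks

class CoreNetwork:
    def __init__(self, supported_protocol="ethernet"):
        self.supported_protocol = supported_protocol
        self.pods = []

    def add_pod(self, pod):
        # The only hard requirement is compatibility with the core network;
        # the pod's internal hardware can be swapped or reprogrammed freely.
        if pod.protocol != self.supported_protocol:
            raise ValueError(f"{pod.name} is not compatible with the core")
        self.pods.append(pod)

    def total_capacity(self):
        return sum(p.capacity_units for p in self.pods)

core = CoreNetwork()
core.add_pod(Pod("pod-1", capacity_units=100))
core.add_pod(Pod("pod-2", capacity_units=100))
# Scaling out to meet demand is just another add_pod call:
core.add_pod(Pod("pod-3", capacity_units=150))
print(core.total_capacity())  # 350
```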
Solutions in implementation
Several companies have developed creative applications of the pod and core design to maximize the performance of their networks. Fastly, a content delivery network operating 17 locations across the globe, has used a pod and core setup to eliminate the need for dedicated load balancers. The company's network simply activates extra pods during hours of peak activity to mitigate latency problems. Egnyte, a file-sharing firm that runs out of three data centers, runs separate pods to handle computing, storage and processing. The company runs additional pods to perform quality assurance tests or pilot new networking solutions. These design strategies highlight the possibilities available to IT teams through the core and pod architecture.
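As a rough illustration of the peak-hour approach described above (not Fastly's actual mechanism), a scheduler might bring standby pods online only during busy windows and spread requests across whichever pods are active. The peak window, pod names and round-robin routing below are invented for illustration.

```python
# Hypothetical sketch: activate extra pods during peak hours instead of
# relying on a dedicated load balancer. All values are assumptions.
from datetime import datetime, timezone

PEAK_HOURS = range(17, 22)  # assumed evening peak window (UTC)

ALWAYS_ON_PODS = ["pod-a", "pod-b"]
STANDBY_PODS = ["pod-c", "pod-d"]

def active_pods(now=None):
    """Return the pods currently serving traffic."""
    now = now or datetime.now(timezone.utc)
    if now.hour in PEAK_HOURS:
        return ALWAYS_ON_PODS + STANDBY_PODS  # bring standby pods online
    return ALWAYS_ON_PODS

def route(request_id, now=None):
    """Spread requests round-robin across the active pods."""
    pods = active_pods(now)
    return pods[request_id % len(pods)]

# Example: during peak hours, four pods share the load; off-peak, two do.
for rid in range(4):
    print(rid, route(rid))
```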
Perle's serial to Ethernet converters connect serial-based equipment across an Ethernet network. The Perle IOLAN range of Console Servers, Device Servers and Terminal Servers features built-in support for IPv6 along with a broad range of authentication methods and encryption technologies.