Forum network switch upgrade

This is the final report for Project 442 - Forum network switch upgrade.


1. Introduction

The aim of this project was to replace all existing HP Procurve 2610-48 10/100 Mb/s switches (plus a couple of 2620-48 switches) on the Informatics Forum data network with more modern Aruba 2530-48G 1Gb/s switches. The total number of switches to be replaced was 55; the number of network closets involved was 17.

2. Outcome of the project

All existing HP Procurve 2610-48 10/100 Mb/s switches were replaced; the Forum edge data network is now composed entirely of 1Gb/s switches.

3. Discussion

This project was a joint effort between several members of our Computing staff and several members of our Technical staff. There are generally no technical 'gotchas' in a project like this, but there is significant planning involved, a great deal of pre-configuration to be done, and the entire thing then relies on coordinating the efforts of several members of staff, all of whom of course have other constraints on their time.

The switch replacements done in the course of this work were not literally like-for-like: in particular, a complication was that the new 2530-48G switches have fewer available uplink ports than do the older 2610-48 switches. (See the planning wiki page for the details on that.) Dealing with this required some repatching of every existing 2610-48 switch before work could commence on the upgrade itself.

The work of configuring the 55 new switches in advance of their final installation was jointly shared by ID and JO. We found that, once familiarity had been gained, we could expect to completely configure one switch in something like 30 minutes. (The work of 'configuration' covers the entire process of unboxing the switch; physically setting it up; connecting it to our network and logically setting it up - including upgrading the firmware; and then finally reboxing it for transport to its destination closet.) But this is hypnotic work and, for similar efforts in future, one might want to budget for configuring no more than about six switches in any one day.

It would have been possible to parallelize the configuration work by using multiple switch installation LCFG profiles and IP addresses (we shared a single one of each), but a limiting factor is physical space: we don't have a dedicated server preparation area, so the most convenient place for the work was the Forum server room itself - and setup space in there is limited. So in this project we dealt with the configuration work sequentially.
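For future planning, the figures above (55 switches, roughly 30 minutes each once familiar, and a suggested limit of about six switches per day) give a simple back-of-envelope schedule estimate. The sketch below is just that arithmetic; the constants come from this report, not from any formal measurement.

```python
import math

SWITCHES = 55            # total switches configured in this project
MINUTES_PER_SWITCH = 30  # rough per-switch time once familiar with the process
MAX_PER_DAY = 6          # suggested sustainable limit per person-day

# Total hands-on configuration time, in hours.
total_hours = SWITCHES * MINUTES_PER_SWITCH / 60

# Working days needed if configuration is done sequentially at the
# suggested pace (one shared profile/IP, as in this project).
days_needed = math.ceil(SWITCHES / MAX_PER_DAY)

print(f"hands-on configuration time: {total_hours:.1f} hours")  # 27.5 hours
print(f"working days at the suggested pace: {days_needed}")     # 10 days
```

Note that the ten days of pacing, rather than the 27.5 hours of hands-on time, is the figure to budget: the daily limit, not the per-switch time, dominates the elapsed schedule.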

Our Technical staff are very busy: dealing with building issues, and dealing with the consequences of the continual upgrade works to the Forum (e.g. the '3 for 2' conversions), eats up their time. We finally managed to coordinate time with the Techs to schedule the installation of the new switches in the closets starting in about February 2018, and we are very grateful for their help; however, had the Techs been less inundated with work, we could have completed this network upgrade sooner (we took physical delivery of the new switches involved in mid-2017).

It would be possible for computing staff to do the job of installing/replacing switches in network closets - but we do not recommend it: the Techs have well-proven and slick systems for this kind of work, and it is best left to them. At the start of the replacement programme, our technicians were budgeting a morning's work for two people per closet. By the end of the work, they were able to upgrade up to three closets in a single morning.
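The installation rates above bound the technician effort for the 17 closets involved. This sketch just works through that arithmetic, using the starting rate (one closet per two-person morning) and the final rate (up to three closets per morning) quoted in this report; the actual total, recorded in the Effort section, fell near the fast end of the range.

```python
import math

CLOSETS = 17     # network closets involved in the upgrade

# Observed installation rates, in closets per two-person morning:
START_RATE = 1   # budgeted at the start of the programme
END_RATE = 3     # achieved by the end, once the work was routine

worst_case = math.ceil(CLOSETS / START_RATE)  # 17 mornings
best_case = math.ceil(CLOSETS / END_RATE)     # 6 mornings

print(f"expected range: {best_case}-{worst_case} two-person mornings")
print("actual (from the Effort section): 7 two-person mornings")
```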

We configure the switches so that they work as required as soon as they are installed and switched on, but there is additional 'house-keeping' work which needs to be done by us post-installation: .pc files need to be adjusted to remove redundant links; Nagios monitoring needs to be re-enabled; network diagrams need to be brought up-to-date; etc. All this takes additional time.

4. Lessons learned

  1. Do not overpopulate switches. The fact that many of the old 2610-48 switches in this case were using all or most of the ports 1 to 48 for user connections caused a lot of extra work in this project: those ports had to be freed up for future switch interlinks before work proper could commence. If closets are starting to look full, arrange to buy and install new switches in plenty of time.
  2. When designing and building server rooms, allocate distinct space for formal preparation areas where incoming equipment can be stored, where switches and machines can be unboxed and set up as necessary, where rubbish and outgoing equipment can be kept separate from the actual server area, etc. If such areas are environmentally more pleasant for humans to work in than are the server rooms themselves, then so much the better.
  3. Hire more technicians.

5. Effort

C(S)O effort: approx 15 person days

Technical staff effort: 7 two-person mornings => 7 person days

6. References

  1. Project entry on
  2. Project home page
  3. Wiki page for planning, organization and progress tracking
  4. Computing systems blog announcement

-- IanDurkacz - 22 Jun 2018

Topic revision: r3 - 25 Jun 2018 - 12:03:17 - IanDurkacz