Final Project Report for Project 164 - Prometheus AFS PTS Conduit


The initial goal of this project was to produce the conduit, datastore and associated code that would allow the AFS PTS database to be synchronised with the main Prometheus Olympus datastore. This would allow users to be added automatically to the PTS database when they appeared in Prometheus, rather than being added by hand as had been the case until then. The scope of the project was soon enlarged to encompass the synchronisation of both the PTS and the VL databases, allowing for the automatic creation and deletion of home directories, but the name of the project was never changed. It was felt that it would be beneficial for members of the CO community outwith the main Prometheus development team to create conduits, as a means of promulgating knowledge of the workings of Prometheus to a wider audience, and so this project was assigned to the Services Unit in the form of myself.

Initial progress

The project began in T1 2010 with the aim of having the conduit (and later both conduits) in service by the start of the 2010-2011 academic year. The need to acquire knowledge of both the internal workings of Prometheus and Perl-based object-oriented programming meant that initial progress was slow, but as the necessary knowledge was acquired the pace accelerated, and the aim of having the conduits in service by September 2010 was achieved.

Finished? Not Quite

Although the PTS and VL conduits and data stores were in place and working satisfactorily, there still remained the small matter of the test suites before the project could be signed off. Each Prometheus conduit and data store is required to have a test suite which can be used to prove automatically that the conduit or data store is working in the expected manner. Clearly, changing the contents of the actual databases associated with the data stores during testing would be highly undesirable, so some way of emulating the behaviour of the underlying database, without changing the code path followed in the conduit or data store, had to be found. Most Prometheus data stores front what are at heart some variety of LDAP database, and it is a relatively simple matter to create and populate a test LDAP server for use with the test suites. Bringing up a test AFS cell is a far more daunting prospect, and after much research and cogitation it was decided that the only practical solution to this problem was to use the Perl mock modules to emulate calls to the Perl AFS module. This took a considerable amount of effort, exceeding that which had been required for writing the conduits and data stores in the first place.
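The essence of the mocking approach is to hand the conduit a stand-in object that records calls instead of touching a live AFS cell. The project itself did this with the Perl mock modules against the Perl AFS module; the sketch below illustrates the same technique using Python's unittest.mock, and all the names in it (PtsConduit, ensure_user, list_users, create_user) are hypothetical, not taken from the real code.

```python
# Analogous sketch of testing a conduit against a mocked AFS binding.
# The real project used Perl mock modules; names here are invented.
from unittest.mock import MagicMock

class PtsConduit:
    """Hypothetical conduit: creates a PTS entry for each new user."""
    def __init__(self, afs):
        self.afs = afs  # in production a real AFS binding, in tests a mock

    def ensure_user(self, username):
        # Only create the PTS entry if it does not already exist.
        if username not in self.afs.list_users():
            self.afs.create_user(username)
            return True
        return False

# Emulate the AFS cell: the mock answers queries and records updates,
# so the conduit follows its normal code path without a real database.
afs = MagicMock()
afs.list_users.return_value = ["alice"]

conduit = PtsConduit(afs)
assert conduit.ensure_user("bob") is True       # new user: entry created
afs.create_user.assert_called_once_with("bob")  # mock recorded the call
assert conduit.ensure_user("alice") is False    # existing user: no call
```

The key point is that the conduit code is unchanged between test and production; only the object it is given differs, which is what makes the test results meaningful.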

Since the conduits and data stores were in place and working, the writing of the test suites was treated as a lower-priority task. This added to the total effort required to complete the project, since a certain amount of knowledge had to be refreshed each time the task was taken up again. In hindsight, it would have been far better to give the writing of the test suites the same priority as the writing of the conduits and data stores, and to get the project finished as quickly as possible.

-- CraigStrachan - 18 Jan 2016

Topic revision: r2 - 08 Feb 2016 - 15:13:09 - CraigStrachan