Theses and Dissertations
Theses and Dissertations (Computer Science)
Browsing Theses and Dissertations by Title
Now showing 1 - 18 of 18
- Automation of Planning and Budgeting Procedures in a University (Obafemi Awolowo University, 1987). Ajayi, Emmanuel Adebayo; Akinde, A. D. [Open Access]
  This work demonstrates the use of computer technology in enhancing efficiency in University Administration, particularly in respect of Academic Planning and Research Management. The project exploited the tremendous capabilities of the computer for the development of an information system to enhance decision-making, specifically in the areas of academic planning and recurrent budgeting of a University. The study was carried out on the Apple II microcomputer. The automation of academic planning and recurrent budgeting procedures in a University has been examined and implemented. The parameters and criteria which influence academic planning and recurrent budgeting exercises in the University were identified, and appropriate programmes and subroutines were developed to estimate, monitor and control these parameters, in line with the National Universities Commission recurrent budgeting techniques for allocation of funds to a University, thus giving the research work wider acceptability and applicability, especially in the Nigerian context. Efficient and dynamic software support was also developed for the Academic Planning and Budgeting process in a University.
- A Collaborative Software Development Model for Co-Located and Virtual Teams (2015-08-06). Elias, Olaronke Ganiat. [Open Access]
  This study investigated the factors affecting collaborative software development in developing countries, constructed a model for collaborative software development and assessed its effectiveness. This was with a view to increasing the usability of software systems and reducing the risks involved in the software development process. The research employed an exploratory study design to obtain information on collaborative software development practices in developing countries. Interviews and questionnaires were used to obtain data from software developers in Lagos, Ibadan and Ile-Ife. Eleven software developers in six software development companies in Lagos and Ibadan were purposively selected for interview. The interview elicited information on the factors affecting collaborative software development in developing countries, the effect of these factors on collaborative software development, and the procedures for collaborative software development. A questionnaire was administered to fifty randomly selected software developers in Lagos, Ibadan and Ile-Ife to obtain information on the ideologies behind collaborative software development, the challenges faced by developers, and approaches to mitigating risks in the software development process. The collaborative software development model was constructed in the Unified Modelling Language using the ArgoUML Computer Aided Systems Engineering tool. The model was assessed for effectiveness using case problems in Nigeria with the application of association-end-multiplicity and class-attribute criteria. The results showed that effective communication, group or team conflict, inadequate requirement analysis, interoperability, standardization and software development methodology were the factors affecting collaborative software development in developing countries. The collaborative software development model showed the interactions among the software developers engaged in the collaborative software development process. The association-end-multiplicity criterion, used for testing association among the classes in the model, showed that the model was effective. Furthermore, the class-attribute criterion showed that the behaviour of the model was also effective. It was concluded that effective requirement analysis, security, communication, an effective software development methodology and interoperability were important for effective collaborative software development practices. It was also concluded that the constructed model would enhance the collaborative software development process in developing countries and reduce the risks associated with collaborative software development.
- A Computer Simulation Model of Optimum Sized Tractor Selection for Agricultural Mechanization (Obafemi Awolowo University, 1985). Adagunodo, Emmanuel Rotimi; Jaiyesimi, S. B.; Mensah, E. K. [Open Access]
  A computer simulation model for the selection of optimum-sized tractors, based on durability and minimum-cost analysis of the tractor, has been developed. There is a need to optimize tractor sizes so as to minimize cost and increase the productivity of the farm mechanization process. The optimal replacement period and reliability of the tractor are considered in this study with respect to the volume of utilization of the tractor. A procedure for determining the optimum tractor size through replacement-period and reliability analysis is presented in the study. The results of the study establish that two tractor models, the David Brown 995 and 990 of 62 and 58 horsepower respectively, are recommended for the two categories of tractor owners and users involved in the study. The DB 995 has been considered adequate for private commercial farmers who cultivate between 30 and 40 hectares of land, and the DB 990 is recommended as the optimum machine for government-owned tractor hiring units. The results of the work show further that a few policy implications may arise from changes in two important quantities: the maintenance and purchase costs of the tractor. Low maintenance cost and a high purchase cost lead to an extended replacement period and high durability. On the other hand, high maintenance cost and a cheap purchase price result in a shortened replacement period and low durability for the tractor. The need to stop subsidies on the tractor purchase price, provide subsidized maintenance and repairs, and establish more government-owned tractor hiring units is stressed in the study.
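  The cost relationships summarised above (low maintenance with a high purchase cost extends the replacement period, and the reverse shortens it) can be illustrated with a minimal sketch of replacement-period analysis. The cost functions and figures below are assumptions for illustration only, not the thesis model or its data.

```python
# Minimal sketch of optimum replacement-period analysis (illustrative figures,
# not the thesis data): keep the tractor for the number of years that
# minimises the average annual cost of ownership.

def average_annual_cost(purchase, salvage_rate, maintenance_first_year,
                        maintenance_growth, years):
    """Average cost per year if the tractor is replaced after `years` years."""
    salvage = purchase * (salvage_rate ** years)           # resale value declines each year
    maintenance = sum(maintenance_first_year * (maintenance_growth ** t)
                      for t in range(years))               # maintenance grows with age
    return (purchase - salvage + maintenance) / years

def optimum_replacement_period(purchase, salvage_rate, m0, growth, horizon=15):
    costs = {n: average_annual_cost(purchase, salvage_rate, m0, growth, n)
             for n in range(1, horizon + 1)}
    best = min(costs, key=costs.get)
    return best, costs[best]

if __name__ == "__main__":
    # Hypothetical numbers: a high purchase price with cheap maintenance extends
    # the optimum period; a cheap purchase with costly maintenance shortens it.
    for label, purchase, m0 in [("high purchase, low maintenance", 40000, 1500),
                                ("low purchase, high maintenance", 20000, 4000)]:
        years, cost = optimum_replacement_period(purchase, 0.85, m0, 1.25)
        print(f"{label}: replace after {years} years (avg cost {cost:,.0f}/year)")
```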
- Design and Implementation of a Runtime System for an Algol-Like Intermediate Compiler (Obafemi Awolowo University, 1987). Akindele, Oluwatoyin Tunde; Owoso, G. O. [Open Access]
  Programming languages have been elevated to an abstract level by the development of higher-level programming languages. The gap created by this abstraction between users' programs and the machine is bridged by means of translation systems known as compilers and interpreters. A programming language system, an ALGOL-like language (ALL) modelled after the compiler-interpreter architecture, has been designed and constructed. In this thesis, the runtime system for the intermediate compiler of ALL has been designed and implemented using two stacks (the main stack and the pointer stack) and a heap; the copying technique of garbage collection is used with the heap to simulate an infinite store. Reverse Polish Notation has been used as the internal language of the translation system. The runtime system has been implemented in the PASCAL programming language. This implementation has resulted in an efficient runtime system with modest operational requirements. The major application areas of the programming language system are introducing the concept of structured programming to beginners in computer science, and enhancing the teaching of compiler/interpreter design and construction at undergraduate level.
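  As a rough illustration of what executing a Reverse Polish Notation stream on a main stack involves, the general idea behind the runtime's internal language, here is a minimal Python sketch. The operator set and value representation are assumptions, not the ALL instruction set, and the thesis implementation itself is in Pascal.

```python
# Minimal sketch of evaluating a Reverse Polish Notation (postfix) stream on a
# stack. The operator set here is an illustrative assumption, not the ALL
# internal language.

def eval_rpn(tokens):
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    stack = []
    for tok in tokens:
        if tok in ops:
            right = stack.pop()          # operands are popped in reverse order
            left = stack.pop()
            stack.append(ops[tok](left, right))
        else:
            stack.append(float(tok))     # operands are pushed as they arrive
    return stack.pop()

if __name__ == "__main__":
    # (3 + 4) * 2 written in postfix form
    print(eval_rpn("3 4 + 2 *".split()))   # -> 14.0
```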
- Design and Implementation of an Algol-Like Intermediate Compiler (Obafemi Awolowo University, 1987). Ajila, Samuel Adesoye; Owoso, G. O. [Open Access]
  The gap created by the abstraction of a programming language from the machine level is bridged by means of a translation system known as a compiler. The design and implementation of this intermediate compiler is based on the recursive descent compiling technique. In this technique, the compiler is activated through the syntax analyser. The syntax analyser is divided into a number of recognition routines, each of which has the task of checking whether a particular kind of phrase is present in the input, and then calls upon the services of other routines to recognize the appearance of sub-phrases. The routines are mutually recursive. In addition, the semantic checking and code generation phases are integrated into the syntax analysis phase. Among the results of this technique is the ability of the intermediate compiler to do its recognition, type checking and code generation without backup. The total code of the compiler is very small and it is written in a high-level programming language (PASCAL). In general these advantages result in a portable and more easily understood compiler. As a result of the implementation technique, this compiler can be used as a teaching aid in compiler construction classes at the undergraduate level. It can also be used to introduce structured programming to beginners in Computer Science.
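  A minimal Python sketch of the recursive descent idea, mutually recursive recognition routines that recognise their phrase without backup, is shown below for a toy arithmetic grammar. The grammar and host language are illustrative assumptions, not the ALL language or the thesis's Pascal code.

```python
# Tiny recursive descent parser for arithmetic expressions: each grammar rule
# (expression, term, factor) is a routine that recognises its own phrase and
# calls the others for sub-phrases, with no backup. Illustrative grammar only.

import re

class Parser:
    def __init__(self, text):
        self.tokens = re.findall(r"\d+|[()+\-*/]", text)
        self.pos = 0

    def peek(self):
        return self.tokens[self.pos] if self.pos < len(self.tokens) else None

    def eat(self, tok):
        if self.peek() != tok:
            raise SyntaxError(f"expected {tok!r}, found {self.peek()!r}")
        self.pos += 1

    def expression(self):              # expression ::= term {(+|-) term}
        value = self.term()
        while self.peek() in ("+", "-"):
            op = self.peek(); self.eat(op)
            value = value + self.term() if op == "+" else value - self.term()
        return value

    def term(self):                    # term ::= factor {(*|/) factor}
        value = self.factor()
        while self.peek() in ("*", "/"):
            op = self.peek(); self.eat(op)
            value = value * self.factor() if op == "*" else value / self.factor()
        return value

    def factor(self):                  # factor ::= number | '(' expression ')'
        tok = self.peek()
        if tok == "(":
            self.eat("("); value = self.expression(); self.eat(")")
            return value
        if tok is None or not tok.isdigit():
            raise SyntaxError(f"expected a number, found {tok!r}")
        self.eat(tok)
        return int(tok)

if __name__ == "__main__":
    print(Parser("2*(3+4)-5").expression())   # -> 9
```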
- The Design of an Integrated Database Management System for a Nigerian University Environment (Obafemi Awolowo University, 1985). Fisusi, Zaccheaus Rotimi; Daini, O. A. [Open Access]
  The rapid growth in the size and complexity of the Nigerian University demands a more effective and efficient information management technique than has been available or necessary in the past. In some Nigerian Universities at present, computer-based data files are established to serve personnel, payroll and student application needs. Data is selectively and frequently copied from source files and merged with transactions to construct new files appropriate for each application. In this project, an attempt has been made to design an integrated database management system that combines the transactions in the personnel, payroll and student application areas of a Nigerian University, to replace the present approach of maintaining separate files for each application area. The approach is based on the relational database model, starting with the construction of relationship graphs and entity-relationship model diagrams in the information structure design phase and ending with the definition of the conceptual schema in the information structure implementation phase.
- Design of an On-line Library Cataloguing System (Obafemi Awolowo University, 1985). Faleye, Emmanuel Omoniyi; Jaiyesimi, S. B. [Open Access]
  The work reported in this thesis is the design of a portable on-line library cataloguing system (including a serials cataloguing and binding subsystem) for possible implementation at the University of Ife library. In on-line catalogue systems, instead of writing records for each book onto cards (the card catalogues presently used at the University of Ife libraries) and transferring these cards onto the entry card catalogue shelves (the author, title and subject catalogue shelves), only one record need be written onto a random-access storage device. This one record is indexed so that it can be found using any one of the access keys for which separate card catalogues are created in a conventional system. All records in the file are easily accessible, and the whole file is consequently available for machine searching in response to each interrogation from a terminal. The various programs were developed and implemented on the TRS-80 Model II microcomputer, with their sample outputs given in Appendix C.
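  A minimal sketch of the central idea, one stored record indexed under several access keys rather than duplicated card entries, might look like the following. The record fields are hypothetical, not the thesis data model.

```python
# Minimal sketch of the on-line catalogue idea: one bibliographic record is
# stored once and indexed under several access keys (author, title, subject)
# instead of being copied onto separate card catalogues. Field names are
# illustrative only.

class Catalogue:
    def __init__(self):
        self.records = {}                          # record id -> record
        self.indexes = {"author": {}, "title": {}, "subject": {}}

    def add(self, record_id, record):
        self.records[record_id] = record
        for key in self.indexes:
            value = record[key].lower()
            self.indexes[key].setdefault(value, set()).add(record_id)

    def search(self, key, value):
        ids = self.indexes[key].get(value.lower(), set())
        return [self.records[i] for i in ids]

if __name__ == "__main__":
    cat = Catalogue()
    cat.add(1, {"author": "Knuth, D.", "title": "The Art of Computer Programming",
                "subject": "Algorithms"})
    print(cat.search("subject", "algorithms"))
```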
- Design of an On-Line Library Circulation System (Obafemi Awolowo University, 1987). Adeyekun, Basiru Gbolagun; Jaiyesimi, S. B. [Open Access]
  The work reported in this thesis is the design of an on-line library circulation system for possible implementation at the Hezekiah Oluwasanmi Library, University of Ife, Ile-Ife. The problems associated with orthodox manual library circulation processes were identified by studying library activities and functions in the Hezekiah Oluwasanmi Library. These problems include the limited and restrictive information base of manual library record keeping; undue user and staff time spent on book location and borrowing; the inconvenience of processing and correlating extensive manual records for circulation control; and the human factor in administering penalties to book defaulters. This project exploits the tremendous capabilities of the computer for the development of an on-line library circulation system to solve the problems identified and to enhance library circulation operations. Efficient and dynamic software has been developed for the on-line library circulation procedures. The design has been tested on an Eclipse C-150 Data General minicomputer and a TRS-80 Model II microcomputer for later on-line implementation on a multi-tasking microcomputer. The design reported here provides adequate and timely information which can be put to a variety of uses, such as library clearance and security, enforcement of accountability, relieving users and library staff of routine clerical procedures, and bringing about an effective circulation system.
- Design of an Online Computerised Payroll System (Obafemi Awolowo University, 1986). Olufokunbi, Karen Cowan; Jaiyesimi, S. B. [Open Access]
  An on-line, interactive computerised payroll system was designed and executed on the Data General Eclipse C-150 computer system. Its features include the production of payslips and associated payroll reports via a top-down design program, and protection of the system for security and confidentiality. The test run proved successful, with an estimated 27.66 hours to perform payroll calculations for 6,000 employees, and the associated preparation of payslips and payroll reports taking between 7.5 hours and 21.0 hours, depending on which of the eighteen different reports was being considered. Compared with the existing punched-card, batch-oriented payroll system used at the University of Ife, the designed program was faster in execution, used fewer human resources and materials in major aspects, and provided more detailed information. Lack of recent cost information on hardware components and differences in output between the existing and the designed system prevented a rigorous comparative pecuniary feasibility analysis. Based, however, on the features exhibited by the existing program and the designed one, the latter is recommended, with associated guidelines that will enhance its effectiveness.
- Development of a Computational Intelligent System for Short-Term Electric Load Forecasting (2015-09-29). Faleye, Abimbola Rashidat. [Open Access]
  This study elicited information related to Short-Term Electric Load Forecasting (STELF) and developed an intelligent system based on the information. This was with a view to setting the basis for implementing commercial software for electric load forecasting. Historical data on short-term electric load forecasting for three years, 2004 to 2006 (the available data), and information on the process involved in short-term electric load forecasting were collected from the National Control Centre, Osogbo, using interviews, observation and contextual inquiries as well as a user diary (system logbook). The knowledge embedded in the data and information collected was elicited and represented using fuzzy-logic-based rules. The fuzzy knowledge space was developed using the fuzzy logic tool in Matlab 7 and the Fuzzy Decision Tree software (FID 3.4). The data for the years 2004 and 2005 were used to develop the system. In order to evaluate the performance of the proposed model, electric load forecasting was performed on National Control Centre data. Randomly selected data from 2004 and 2005 were used for model validation, while the data for 2006 were used for model verification. An intelligent system was developed and used to produce 24-hour-ahead forecasts of electric load. The prediction results from the proposed model (Fload) and the conventional model (F) were compared with the actual load based on the fractional errors computed. The fractional errors were the variations from the actual load. The average fractional errors for Fload and F for the different periods are: January 1st 2004 (0.12 and 0.45), June 5th 2004 (24.91 and 42.95), March 2nd 2005 (0.14 and 1.10), September 4th 2005 (0.22 and 0.88) and January 1st 2006 (0.22 and 0.78). The average fractional forecast errors for the proposed model were less than those of the conventional model, validating the effectiveness of the proposed approach. The study concluded that the use of fuzzy logic in short-term electric load forecasting gave more accurate results than the conventional model, hence the advantage of the proposed model over the conventional model.
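  The comparison above is expressed in average fractional errors; a minimal sketch of that calculation, using made-up load values rather than the National Control Centre data, is shown below.

```python
# Minimal sketch of comparing two 24-hour-ahead forecasts against the actual
# load using fractional errors, the measure quoted in the abstract.
# The load values below are made up, not the National Control Centre data.

def fractional_errors(actual, forecast):
    """Per-hour fractional deviation of a forecast from the actual load."""
    return [abs(f - a) / a for a, f in zip(actual, forecast)]

def average_fractional_error(actual, forecast):
    errors = fractional_errors(actual, forecast)
    return sum(errors) / len(errors)

if __name__ == "__main__":
    actual   = [2800, 2750, 2900, 3100]   # MW, illustrative hourly loads
    fuzzy    = [2810, 2760, 2880, 3120]   # proposed (fuzzy) model output
    baseline = [2600, 2950, 2700, 3350]   # conventional model output

    print("Fload:", round(average_fractional_error(actual, fuzzy), 3))
    print("F    :", round(average_fractional_error(actual, baseline), 3))
```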
- Development of a Job Coordination Model for Grid Computing Architecture (2015-09-23). Adeyelu, Adekunle Adedotun. [Open Access]
  This research developed and simulated a homogeneous architecture for job coordination on a high-throughput grid computing system, in order to allow platform-independent and decentralized programming of all categories of jobs in a grid computing environment. A job coordination model using a shared-memory data structure based on the Objective Linda coordination language was formulated. The scheme worked by posting data from the worker nodes to this memory using templates and retrieving it using associative pattern matching. The model was analysed theoretically and simulated using the Java programming language. Performance analyses were carried out using the following parameters: delay time and effectiveness. The results showed that delay times changed in a predictable pattern for all events. As the number of nodes used for processing jobs increased from 100 to 700, delay time reduced from 7.425 ms/node to 5.724 ms/node for migration events and increased from 17.930 ms/node to 18.095 ms/node for checkpointing events. Also, as the number of nodes on the grid for migration-with-checkpointing events increased from 200 to 900, the delay time reduced from 38.240 ms/node to 34.640 ms/node. The results further showed that the introduction of a memory management scheme reduced the overall delay time by 24.64% compared with the conventional memory scheme. This study concluded that the developed model has decentralized, interoperable and homogeneous capabilities and utilizes the shared-memory data structure concept. The simulated model proved to be effective and efficient for running different categories of jobs on the grid. The scheme will attract developers of high-throughput grids.
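  A minimal sketch of the coordination style described above, posting tuples to a shared memory and retrieving them by associative pattern matching against templates, is given below. The field layout and API are assumptions for illustration, not the Objective Linda semantics or the thesis's Java simulation.

```python
# Minimal sketch of a shared tuple space: worker nodes post data as tuples and
# withdraw matching tuples by associative pattern matching against templates.
# Field layout and operation names are illustrative assumptions.

import threading

ANY = object()   # wildcard used in templates

class TupleSpace:
    def __init__(self):
        self._tuples = []
        self._lock = threading.Lock()

    def out(self, tup):
        """Post a tuple into the shared space."""
        with self._lock:
            self._tuples.append(tup)

    def inp(self, template):
        """Withdraw (and remove) the first tuple matching the template, or None."""
        with self._lock:
            for i, tup in enumerate(self._tuples):
                if len(tup) == len(template) and all(
                        t is ANY or t == v for t, v in zip(template, tup)):
                    return self._tuples.pop(i)
        return None

if __name__ == "__main__":
    space = TupleSpace()
    space.out(("job", 42, "checkpoint", b"state-bytes"))
    space.out(("job", 7, "result", 3.14))

    # A node looking for any finished result, regardless of job id:
    print(space.inp(("job", ANY, "result", ANY)))
```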
- Development of a Pathology Information System Using Mobile Agent Technology (2015-04-28). Nasir, Musiliat Ibijoke. [Open Access]
  This work developed and implemented mobile-agent-based software for gathering pathology investigation results, with a view to reducing the delay involved in the system. The software developed was based on the waterfall model. The requirement analysis was carried out by selecting pathologists in four hospitals. Focus group interviews were carried out to establish the existing procedure for gathering pathology investigation results. The modules identified after the focus group interviews consisted of Configure, Search, Send report, Retrieve result and Update registration. These modules were designed using object-oriented techniques. The designed modules were implemented using the Java programming language, while the database was implemented using the Microsoft Access 2000 engine. The developed software was tested using a rapid prototyping technique. The mobile-agent-based system was implemented and tested using four personal computers running on heterogeneous platforms and a bandwidth of 5 kbps. The mobility feature enables a mobile agent to migrate to remote hosts where information is stored and to execute the user's request autonomously. The tested parameters in the interaction include Full Blood Count (FBC), Packed Cell Volume (PCV), White Blood Cell (WBC) count, isolate sensitivity to antibiotics such as Penicillin G (Benzyl Penicillin), Ampicillin and Nitrofurantoin, as well as Bilirubin Total, Conjugated Bilirubin, Calcium and Protein. The results showed that, for a single request, the transmission time was 1 second for both the proposed agent-based technique and the non-agent-based technique at a bandwidth of 5 kbps. However, for more than one request using the same bandwidth, the transmission time for the proposed model was 5 seconds as against 10 seconds for the non-agent-based system. The proposed approach provides a cost-effective solution and generates lower server delay overhead, resulting in 98% efficiency as against 55% efficiency for the non-agent system. In addition, the implemented agent-based system achieves quicker retrieval of pathology investigation results from hospitals in the network. It is concluded that this mobile-agent-based system is a recommendable infrastructure to facilitate collaboration among health care providers for effective diagnosis and treatment of patients.
- Development of a Server Switching Model for Internet Performance Improvement (2015-08-24). Adeosun, Olajide Olusegun. [Open Access]
  This study developed a model and an algorithm for simulating automatic server-switching Internet connectivity. It also evaluated the performance of the developed model. This was with a view to eliminating the rollback problems associated with the current Standby Replacement scheme and thereby providing uninterrupted Internet services. A non-intrusive server-switching Internet connectivity system was designed using Markovian and stationarity processes to identify and absorb faults impairing Internet service performance. System failure was measured discretely, or "counted", together with stationarity rules using Markovian processes. The Markovian and stationarity processes were used to develop the algorithm for the simulation program that implemented the server-switching process. A simple architecture based on a basic fault-tolerant architecture was developed for the server switching. This was simulated using Microsoft Visual Basic version 6.0. The performance of the system was tested on the Series, Triplicated Modular Redundancy and Standby Replacement schemes. The performance evaluation of the developed algorithm was carried out by writing code implementing the algorithm in Microsoft Visual Basic version 6.0. The results showed that the model of a single connection is characterized by three non-negative parameters: the transition rate λj from state 0 to state 1, the transition rate μj from state 1 to state 0, and the transfer rate Bwj when in state 1. The algorithm defined Oj(t), the total time spent by the network connection from proxy server Sj to the client in the active state (i.e. Xj(t) = 1) during the interval [0, t], which represented the operational time distribution of the Markov process X over the interval [0, t]. The results also showed that server-switching Internet connectivity gave an Internet performance improvement of 5.96% over the existing single-server model of Internet connectivity. Markovian and stationarity rules enabled automatic switching from a faulty server to the next immediately viable spare. Similarly, the rollback problem associated with the Standby Replacement scheme during system downtime was removed by deployment and redeployment of the faulty server. The developed model exhibited 99.98% system availability with 0.019% deviation from the expected standard (99.999%), while the existing Standby Replacement scheme had 99.68% system availability with 0.319% deviation. The study concluded that the developed automatic server-switching technique provided increased system availability for Internet users.
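  A minimal sketch of the two-state connection model named above, with illustrative rates rather than the thesis measurements, shows how steady-state availability follows from the two transition rates and how the operational time in the active state can be simulated.

```python
# Minimal sketch of a two-state Markov connection model: state 0 (down) moves
# to state 1 (active) at rate lam, and state 1 moves back to state 0 at rate mu.
# Steady-state availability is the long-run fraction of time in state 1.
# Rate values below are illustrative, not the thesis measurements.

import random

def steady_state_availability(lam, mu):
    """Long-run probability of being in the active state (state 1)."""
    return lam / (lam + mu)

def simulate_operational_time(lam, mu, horizon, seed=1):
    """Simulate the chain and return O(t): total time spent active in [0, horizon]."""
    rng = random.Random(seed)
    t, state, active_time = 0.0, 1, 0.0
    while t < horizon:
        rate = mu if state == 1 else lam
        dwell = min(rng.expovariate(rate), horizon - t)
        if state == 1:
            active_time += dwell
        t += dwell
        state = 1 - state
    return active_time

if __name__ == "__main__":
    lam, mu = 2.0, 0.002          # illustrative: recovery is fast, failure is rare
    print("analytic availability :", steady_state_availability(lam, mu))
    print("simulated availability:", simulate_operational_time(lam, mu, 10_000) / 10_000)
```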
- Development of an Enhanced Accounting Scheme for Grid Computing Architecture (2015-04-10). Akinwunmi, Akinwale Olusegun. [Open Access]
  This research developed a modification to the Grid Accounting Scheme (GridBank) by formulating a model to enhance the scheme. The enhanced scheme was simulated and its performance evaluated, with a view to eliminating the manual mode of processing, speeding up transactions and reducing time delays. A PayPal layer was added to the existing three layers, which enhanced the scheme to allow automation of the GridBank administration module. The enhanced scheme was formulated using a web service approach that allowed cross-platform interoperability. A web service was created using Hypertext Preprocessor (PHP, a web development language) and MySQL (a relational database management system) to establish the link between the PayPal system and the existing layers. The scheme was simulated using a visual modeler. Processing delay and load scalability were used to assess the performance of the two schemes. The results of the simulated model were analysed and interpreted. Processing delay was specified as a function of processing time and number of users, while load scalability was specified as a function of load and the number of available resources during three different periods of operation, namely peak, off-peak and holiday, in order to analyse the model. The results of the simulation showed that as the number of users increased, the processing time gradually reduced for the enhanced scheme; hence the processing delay of the enhanced scheme reduced, and its curve had an R² value of 0.96. However, as the number of users increased, the processing time increased for the existing scheme; hence its processing delay increased, and its curve had an R² value of 0.32. Also, as the number of available resources increased, the enhanced scheme scaled the load properly, with R² values of its curves of 0.05 (peak period), 0.03 (off-peak period) and 0.42 (holiday period), as against 0.02 (peak period), 0.01 (off-peak period) and 0.25 (holiday period) for the existing scheme with a smaller number of resources. It was concluded that the enhanced accounting scheme provided the required automation for efficient and secure grid accounting operations. This distinguished it from the existing scheme.
- Development of an Improved Quality of Service Model for Electronic Commerce (2015-03-31). Ajayi, Anuoluwapo Olanrewaju. [Open Access]
  This research designed an intelligent bandwidth management model, which was simulated and evaluated based on performance metrics, and a prototype of the model was implemented. A survey of the existing QoS provisioning models and their modus operandi was carried out. Eighty questionnaires were administered in Southwestern Nigeria to identify the requirements and needs of e-commerce clients that are peculiar to developing countries. Findings from the data-gathering stage prompted the proposition of an intelligent QoS provisioning model based on the Unified Modelling Language (UML). The sub-modules of the model consisted of a Fuzzy Semantic Information Retrieval (FSIR) engine, which returns a ranked list of objects that have been estimated to satisfy the user's request; a Fuzzy Inference based Bandwidth Manager (FIBM), which determines the proportion of the service rate that is assigned to the user's request; and a Fuzzy Inference based Data Compressor (FIDC), which determines the image variant to transmit depending on the traffic and the clients' system conditions. The Fuzzy Logic Toolbox from Matlab was used to model the sub-modules. A prototype of the model was implemented as a web-based n-tier application comprising the client tier, the web tier and the enterprise tier, using the Java 2 Enterprise Edition (J2EE) language. In order to investigate the performance of the proposed model, a simulation program was developed using the MATLAB modelling language. In the simulation, clients' system characteristics, their purchase histories and the page sizes after a search operation were randomly generated from statistical distributions. The daily traffic pattern of the Obafemi Awolowo University network was used to model the traffic in the simulation. The proposed model was benchmarked against the current Internet Best-Effort model using QoS performance metrics such as response time, throughput and latency. The simulation results revealed a better performance of the proposed model over the Best-Effort model, with about 52% reduction in system response time, 66% reduction in system latency, and 115% increase in system throughput at a bandwidth of 1 Mb for a network of 600 clients. The MATLAB normal probability plot, when used to analyse the results at 95% confidence intervals, indicated that the 75th percentile of the observations under the proposed model had a response time of less than 8.5 seconds, a latency of 0.85 second, and a throughput of 56%. The effect of using sub-standard systems to query a web server, a consideration that is absent in existing QoS provisioning schemes, was also investigated. Clients using sub-standard systems under the proposed and Best-Effort models had mean response times of 12.43 s and 22.04 s with standard deviations of 0.2 s and 3.5 s respectively, during the off-peak and peak load conditions. Thus the proposed model produced consistent system response times for clients. The results obtained from the prototype implementation were in good agreement with the simulations. It was concluded that the robust and intelligent QoS provisioning model developed and implemented in J2EE provided effective management of the quality of service of web applications, particularly where computing and network resources are limited.
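  As a heavily simplified stand-in for the fuzzy inference based bandwidth manager (FIBM) described above, the following sketch maps traffic load and a client's purchase history to a share of the service rate using a crisp rule table. The breakpoints and output shares are assumptions for illustration, not the thesis's Matlab fuzzy rules.

```python
# Minimal sketch of the bandwidth-manager idea: decide what proportion of the
# service rate a request receives from the current traffic load and the
# client's purchase history. A crisp rule table stands in for fuzzy inference;
# all thresholds and shares are assumed values.

def bandwidth_share(traffic_load, purchase_score):
    """
    traffic_load   : fraction of link capacity currently in use, 0.0 to 1.0
    purchase_score : normalised client purchase history, 0.0 to 1.0
    returns        : fraction of the service rate assigned to this request
    """
    load = "high" if traffic_load > 0.7 else "medium" if traffic_load > 0.4 else "low"
    client = "valued" if purchase_score > 0.6 else "ordinary"

    rules = {
        ("low", "valued"): 0.50, ("low", "ordinary"): 0.35,
        ("medium", "valued"): 0.30, ("medium", "ordinary"): 0.20,
        ("high", "valued"): 0.15, ("high", "ordinary"): 0.05,
    }
    return rules[(load, client)]

if __name__ == "__main__":
    print(bandwidth_share(traffic_load=0.85, purchase_score=0.9))   # busy link, valued client
    print(bandwidth_share(traffic_load=0.20, purchase_score=0.1))   # idle link, new client
```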
- Digital Simulation of Stochastic Differential Equations (Obafemi Awolowo University, 1985). Adewumi, David Olambo; Jaiyesimi, S. B. [Open Access]
  The study of stochastic differential equations (SDEs) started with mathematicians using them as tools in the solution of physical problems. In science and engineering, SDEs arise in a natural phenomenon known as "white noise". Most of the major work in this area has been on the analytical solution of SDEs; very little attention has been paid to digital simulation techniques. In this thesis we have evolved digital and analog simulation techniques for solving SDEs. We have used the TRS-80 Model II microcomputer system at the University of Ife, Nigeria, and the 680 analog/parallel logic computer system of the University of Sussex, U.K., in our digital simulation procedures. We have used the improved Euler and Runge-Kutta methods of numerical integration. We have solved some problems in science and engineering using the digital simulation techniques evolved. These problems are SDEs describing white noise, the Langevin equation, the influence of a rapidly fluctuating density of the earth on the motion of a satellite in a circular orbit, the motion of a rigid body rotating under a random force, and the Fokker-Planck equation. We have also considered the convergence of the results and the probable error in the simulation experiments.
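  A minimal sketch of digitally simulating one of the named examples, the Langevin equation, is shown below using the basic stochastic Euler (Euler-Maruyama) step. The thesis adapts improved Euler and Runge-Kutta schemes, so this simpler scheme and its parameter values are illustrative only.

```python
# Minimal sketch of digitally simulating the Langevin equation
#     dv = -gamma * v * dt + sigma * dW
# with the basic stochastic Euler (Euler-Maruyama) step. Scheme and parameter
# values are illustrative, not the thesis's improved Euler / Runge-Kutta codes.

import math
import random

def simulate_langevin(gamma=1.0, sigma=0.5, v0=1.0, dt=1e-3, steps=5_000, seed=0):
    rng = random.Random(seed)
    v = v0
    path = [v]
    for _ in range(steps):
        dW = rng.gauss(0.0, math.sqrt(dt))      # Brownian increment ~ N(0, dt)
        v = v + (-gamma * v) * dt + sigma * dW  # Euler-Maruyama update
        path.append(v)
    return path

if __name__ == "__main__":
    path = simulate_langevin()
    # The long-run variance should approach sigma^2 / (2 * gamma) = 0.125 here.
    tail = path[len(path) // 2:]
    mean = sum(tail) / len(tail)
    var = sum((x - mean) ** 2 for x in tail) / len(tail)
    print("sample variance over the second half of the path:", round(var, 3))
```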
- Impact of Information and Communications Technology on the Operations of the Nigerian Capital Market (2015-04-23). Idowu, Akinyele Akinwumi. [Open Access]
  This study identified the operational activities of the Nigerian Stock Exchange (NSE) and the extent of the use of Information and Communications Technology (ICT) for the enhancement of its operations. This was with a view to assessing the impact of ICT on the performance of the NSE. The data for the research were obtained from both primary and secondary sources. Primary data were collected using structured and unstructured questionnaires, backed up by interviews and observations. Twenty-five (25) stock-broking firms were purposively selected from the two hundred and seventeen (217) registered with the Stock Exchange, based on their operational capacity, age, spread and experience. In each of these stock-broking firms, fifteen (15) investors were also purposively selected based on their net worth, giving a total of three hundred and seventy-five (375) respondents. Other respondents from various groups working in the Stock Exchange were randomly selected: these were from corporate law firms, facility back-up providers, reporting accountants and the Nigerian Stock Exchange staff, a total of twenty (20) respondents, and twenty-five (25) jobbers and dealers in the Stock Exchange were also selected. Altogether, the total sample size was four hundred and twenty (420). The study showed that all operational activities of the NSE had become automated in all the selected stock-broking firms. A switch from the manual call-over system to the Automated Trading System (ATS) was identified. Also introduced to the operations of the NSE were the Central Securities Clearing System (CSCS), Trade Alert (TA) and Remote Trading (RT). These technical changes brought about an increased level of confidence in the Stock Exchange. The majority of the respondents (70%) expressed a much higher degree of confidence in the ICT-backed NSE operation. The results also showed that the All Share Index (ASI) and Market Capitalization (MC) were positively correlated. The degree of correlation was higher in the post-automation era (r = 0.94, t = 1306, P < 0.05) than in the pre-automation era, where r = 0.75 (t = 96, p < 0.05). The mean All Share Index also showed better performance in the post-automation era than in the pre-automation era (t = 3.93, P < 0.05). The study concluded that the Automated Trading System and the Central Securities Clearing System had brought an increase in the confidence level of investors and a consequent increase in the volume of activities, owing to the transparency and promptness of service delivery that ICT has provided.
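  The correlation figures quoted above are Pearson coefficients between the All Share Index and Market Capitalization; a minimal sketch of that calculation, on made-up series rather than NSE data, is shown below.

```python
# Minimal sketch of the correlation analysis reported above: Pearson's r
# between the All Share Index (ASI) and Market Capitalization (MC) over a
# period. The figures below are made up, not NSE data.

import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

if __name__ == "__main__":
    asi = [12000, 13500, 15200, 16800, 20100, 23400]   # illustrative index levels
    mc  = [1.9, 2.1, 2.4, 2.6, 3.2, 3.6]               # illustrative, trillion naira
    print("r =", round(pearson_r(asi, mc), 3))
```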
- Requirements Specification Using Activity Analysis and Design Framework for Primary Health Care Information System in Ife Central Local Government, Nigeria (2015-03-27). Afolabi, Adekunle Oluseyi. [Open Access]
  This research assessed the existing manual Primary Health Care Information System and identified actors and their contributions to information system development, with a view to building an effective requirements specification for developing a computer-based Primary Health Care Information System. Primary data were collected from several relevant actors at the Primary Health Care centre, Enuwa, in Ife Central Local Government, using interview and participatory techniques. The Activity Analysis and Design (ActAD) checklist was used as a guide for the interviews. Actors were identified using the ActAD framework. Documents were inspected and the existing system processes in the centre were observed. The relevant actors were also observed at work. The ActAD framework was used to analyse the work activity at the centre and to identify the stakeholders. The Volere requirements specification template was employed to document the results. The results showed that there were 24 actors who use patients' information at the Primary Health Centre. Such actors included health managers, health workers and other research agencies. The research also showed that most of the actors contributed to the development and use of the information system at the primary health care level through direct data capturing and processing. The contributions of others were in the areas of report appraisal, trends of diseases and drug use. A list of 46 functional requirements that the system would capture and process, such as patients' biographic data and data on family planning and immunization, was obtained through the information elicited from the various actors. Over 80% of the actors interviewed agreed on the relevance of each of the requirements specified. More than 75% of the requirements were confirmed to be relevant during validation. The assessment showed that the existing manual Primary Health Care system was hitherto disjointed and had no well-defined requirements, and hence flawless computerisation could not easily be achieved. It was concluded that, for an efficient computer-based Primary Health Care Information System to be built, there was a need for proper coordination and a set of effective and flawless requirements specifications produced by all stakeholders.