From Charles Babbage's Analytical Engine to the latest robotics technology, information technology has come a long way. Computer science and information technology have driven a major share of worldwide economic growth over the last two decades and more. Information technology has changed tremendously since its arrival. Initially, offices that had computers used them mainly to run office software such as Microsoft Office for a handful of their operations.
Things started booming, and IT is now integrated with almost all the operations of an organization. The IT department has become the central department, connected with every other department of an organization. Most tasks are performed with the help of computers. The whole supply chain is managed with software and the hardware that supports it. Numerous automation packages on the market help organizations perform nearly all of their tasks. One of the most common examples is Enterprise Resource Planning (ERP) software. It can handle all the operations of an organization, such as distribution, sales, manufacturing, inventory management, and invoicing. It consists of various separate functions that are integrated with one another and perform their operations accurately.
Historically, IT moved to DOS-style interfaces after the Analytical Engine and the early room-sized machines. Programmers and computer experts were the only ones who could use these early interfaces, and even then only those who knew the particular language involved. Computer languages were so different from one another that in order to use a new language, a programmer had to learn it from scratch; each was distinct and, in its own way, powerful. When computers first appeared, the majority perceived them as machines that would remain confined to science. With the development of Windows and Office by Microsoft, however, that perception changed significantly: computers could now fit in offices and in some homes. With further research and advancement of technology, people realized that the next age was going to be the age of information technology. Networks were introduced, and this gave computers all the popularity and acceptance they could get.
End users came to realize the extensive ability of computers to help people do their jobs quickly and accurately. As technology became more significant, the size of computers gradually decreased. From the first computers, which filled an entire room, to machines that fit in the palm of a hand, information technology has come a long way, and this has happened in a mere 20 years or so. Students from any field have to be familiar with information technology in order to be successful. Even medical students and students of the arts cannot deny that they need information technology to perform their work. Information technology has helped medicine tremendously; more and more sophisticated technology is being used for patient care (Allen & Morton, 1994).
The Future of IT
Information technology has boosted the growth of all organizations, and the IT departments of all enterprises have to handle ever-increasing volumes of data. The competitive environment of the 21st century demands that all organizations come up with new strategies and products, especially IT-related products. After the integration of IT into all departments of life, the next step is toward real-time automation. Information technology enthusiasts are now in search of software that could think: software that would reduce costs, be more reliable, and adapt based on experience. The next focus of developers is on information infrastructures with this kind of thinking ability. The new technology would be able to understand and adapt to changes that happen in the environment and in the data itself, making the necessary adjustments by itself. This line of research is mostly referred to as 'Artificial Intelligence.'
This would make systems more reliable and reduce breakdowns. System administrators would be able to make more changes without extensive programming, and the systems' learning capability would allow them to perform optimally. Developers are also researching adaptive data integration, which would facilitate organizations tremendously. It involves adjusting to standards while minimizing the operational changes required for the adjustment. Adaptive integration would also reduce bugs and help cut implementation and maintenance costs. The time systems take to perform their tasks would drop tremendously, resulting in optimal performance in minimal time.
Research on artificial intelligence has been going on for decades now. However, the research has not been implemented fully, for various reasons. One of the major reasons is a lack of proper planning and management. Much of this research is carried out by groups of individuals doing most of their work on an ad hoc basis. This causes a number of projects to fail, because in such cases the analysis and recognition of risks is close to zero, and when the risks become actual problems, the organization or project team has no means of dealing with them. The next section of the paper discusses project management and risk management in the context of information technology research and how proper project management could facilitate research (Fryman, 2004).
Problems with IT and Research
In spite of IT's phenomenal growth, the number of computer science and IT graduates has started to decline dramatically. There are various reasons for this decline. One major cause is the perception that the IT job market is saturated, which is not the case. Another is inadequate preparation in mathematics and science in the earlier years of study.
Besides these causes, stricter immigration policies have also discouraged students from taking computer science or IT as majors. Students also have incorrect information regarding various fields and their growth opportunities. The media has also been responsible for this decline through false claims of declining IT growth in the wake of the Y2K bug and several similar scares. For research in the field to flourish, we need to remove these misconceptions and counter the bad publicity. Moreover, proper mentoring in the field is also necessary (Microsoft Research, 2006).
Software Project Management
New projects and software development have declined tremendously. One of the reasons for this is a lack of proper software project management. Everything, whether big or small, needs to be properly planned to turn out to be a success, and the same goes for software development. With the tremendous increase in demand for information technology in the 21st century, numerous projects have been developed. However, programmers do most of their work in an ad hoc manner without proper planning. This practice has been successful much of the time, but times are changing, and with the increasing sophistication and expense of technology, we need proper planning and research in software development.
This would ensure that software is developed on time and caters to the requirements. Project management involves the planning, organizing, and management of resources so as to facilitate the achievement and completion of the project's goals and objectives. A project has specific start and completion dates. Most of the time, both dates are forecasts and may change. A project is undertaken to benefit a specific group or, in the case of generic software, a large group of people. Projects are finite, whereas processes are iterative operations used to produce the same product or provide the same service repeatedly. This difference makes the management of a project quite different from the management of a process (Kelsey, 2006).
The core objective of software project management is to achieve the goals of the project while maintaining standards, usually scope, quality, time, and compliance with the budget. The secondary objective is optimization of the inputs and resources necessary to complete the project. The following tasks lie in the domain of software project management (Wikipedia, 2008):
1. Planning and jotting down objectives; deliverables and milestones
2. Planning of the project according to the deliverables
3. Managing and controlling risks
4. Accurate estimation of resources
5. Organizing work
6. Acquiring resources, both manpower and material
7. Adequate assignment of tasks
8. Regulating activities
9. Controlling execution of project activities
10. Monitoring and tracking progress
11. Analysis of results based on up-to-date achievements
12. Definition of project products and services
13. Prediction of future developments in the projects through research
14. Maintaining quality and standards
15. Proper management and solution of issues that arise during project development
16. Bugs and error prevention
17. Identification and management of changes
18. Finalizing the project
19. Communication with stakeholders or prospective clients
All of the above steps are critical to the life of any project, and if any of them is not managed properly, the effects on the whole project can be drastic. With technology becoming more advanced every day, software developers and managers need to adopt standards so that uniformity is possible and, most importantly, so that the developed software meets requirements and development is completed within the deadline. These are the two most important issues in software development.
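Several of the tasks listed above — defining deliverables and milestones, monitoring and tracking progress, and identifying issues that arise — can be captured in a simple tracking structure. The following is a minimal sketch, not a prescribed tool; all names, dates, and milestones are illustrative:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Milestone:
    name: str
    due: date
    done: bool = False

@dataclass
class Project:
    name: str
    milestones: list = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of milestones completed so far."""
        if not self.milestones:
            return 0.0
        return sum(m.done for m in self.milestones) / len(self.milestones)

    def overdue(self, today: date) -> list:
        """Open milestones past their due date -- candidates for escalation."""
        return [m for m in self.milestones if not m.done and m.due < today]

# Hypothetical project plan for illustration.
proj = Project("ERP rollout", [
    Milestone("Requirements signed off", date(2008, 3, 1), done=True),
    Milestone("Prototype delivered", date(2008, 6, 1), done=True),
    Milestone("User acceptance testing", date(2008, 9, 1)),
    Milestone("Go-live", date(2008, 12, 1)),
])

print(f"Progress: {proj.progress():.0%}")
print("Overdue:", [m.name for m in proj.overdue(date(2008, 10, 1))])
```

Even a structure this small makes the difference between ad hoc development and managed development visible: progress and slippage become measurable facts rather than impressions.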
Another important aspect of project development, and specifically of research projects, is risk management. Developers and managers tend to overlook this phase, especially in research-based projects, and the projects fail. The failure of the project is then blamed on the failure of the research, but in fact it is the management that failed, not the project itself. Therefore, identifying risks and devising a proactive strategy are necessary for projects to succeed. One reason behind the failure of various research efforts is that the developers and organizations undertaking them do not take project management seriously (Jalote, 2002).
Risk management is an organized approach to managing the uncertainty that exists during the life of a project. The uncertainty could be a threat or a possible cause of the project's failure. A risk is the prediction of a problem that might arise and prevent the project from fulfilling its goals. Various schools of thought exist on managing risks; the strategies include avoidance, reduction, acceptance, and planning for the consequences. Different categories of risk management also exist, such as strategic risk management and financial risk management (Boehm, 1997).
The main target of risk management is to reduce the probability of a risk becoming a problem for a project. In an ideal risk management environment, risks are prioritized on the basis of the magnitude of potential loss and the probability of occurrence. Risks posing large loss threats with a high probability of occurring are placed at the top of the list and handled first. However, this approach can be mishandled, in the sense that it is difficult to prioritize between risks with a high probability of occurrence and risks with high potential loss. Another form is intangible risk management, in which the manager identifies a new kind of risk, one that has a 100% probability of occurring.
The organization ignores such a risk because it has failed to identify it. A simple but applicable example: when an organization applies minimal knowledge to a situation, the risk in that case is a knowledge risk. Other risks that can exist within an organization's management are relationship risk, which arises from ineffective cooperation, and process management risk, which arises when faulty operational procedures are used. Risks that arise during project development can be classified into the following categories (Galorath & Evans, 2006):
Incorrect requirement gathering – Some projects, whether generic (research-based) or customized, are unable to meet requirements because the requirement gathering process was performed incorrectly at the outset. This poses a major threat to the successful completion of the project. The risk is greatest when the project team lacks a vision for the product or is unable to cope with rapidly changing requirements.
Dependency issues – In some cases, project completion depends on a number of external factors that are not controllable by the project team. This increases risk for the project, since these dependencies are extremely difficult to control.
Management issues – Usually, it is the project manager who is responsible for formulating the risk management plan, and if the plan is written incorrectly or some issue is left out for personal or professional reasons, the project is very likely to fail.
Lack of knowledge issues – Technology, both software and hardware, changes rapidly, and organizations sometimes find it difficult to cope with these changes. The main way to handle this risk is to increase the number of trained people on a project (Pandian, 2006).
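The prioritization scheme described earlier — weighing probability of occurrence against magnitude of loss — is commonly formalized as risk exposure, the product of the two. The sketch below ranks a hypothetical risk register this way; the risk names and figures are illustrative assumptions, not data from any real project:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float  # estimated likelihood of occurring, 0.0 to 1.0
    loss: float         # estimated cost if the risk materializes

def prioritize(risks):
    """Rank risks by exposure (probability * loss), highest exposure first."""
    return sorted(risks, key=lambda r: r.probability * r.loss, reverse=True)

# Hypothetical risk register for illustration.
risks = [
    Risk("Requirements change mid-project", probability=0.6, loss=50_000),
    Risk("Key developer leaves", probability=0.2, loss=120_000),
    Risk("Third-party dependency discontinued", probability=0.1, loss=200_000),
]

for r in prioritize(risks):
    print(f"{r.name}: exposure = {r.probability * r.loss:,.0f}")
```

Note how the ranking resolves the tension described above: a moderate loss with high probability can outrank a catastrophic loss with low probability, which is exactly the kind of trade-off that is hard to make by intuition alone.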
Software development teams and enterprises need to integrate proper project and risk management into project development. This would ensure that projects meet requirements and deadlines and do not exceed forecasted costs, which is also a problem for ongoing projects. Moreover, successfully completed research projects would encourage further research, which is necessary for the growth of information technology. Project management is often dismissed as useless by developers, but they must understand the difference it can make to the success of any project. It not only helps developers understand the requirements; it also helps them synchronize tasks and meet deadlines. Moreover, it reduces risk through proper risk identification and a proactive strategy.
References
Allen, T., & Morton, M. S. (Eds.). (1994). Information Technology and the Corporation of the 1990s. New York: Oxford University Press.
Boehm, B. (1997). Introduction to Software Risk & Risk Management. Retrieved July 20, 2008, from http://www.baz.com/kjordan/swse625/intro.html
Fryman, H. (2004, March 1). The Future of IT Is Automation (pp. 1–4). Retrieved July 20, 2008, from http://www.cioupdate.com/reports/article.php/11050_3319601_1
Galorath, D. D., & Evans, M. W. (2006). Software Sizing, Estimation, and Risk Management. Retrieved July 20, 2008, from http://books.google.com.pk/books?id=MQL45_XhyHYC
Jalote, P. (2002). Software Project Management in Practice. Retrieved July 20, 2008, from http://books.google.com.pk/books?id=JUwQz2A_k_gC
Kelsey, R. B. (2006). Software Project Management: Measures for Improving Performance. Retrieved July 20, 2008, from http://books.google.com.pk/books?id=VMH2YGL5aiIC&dq=software+project+management&source=gbs_summary_s&cad=0
Microsoft Research, External Research Program. (2006). The Future of Information Technology. Retrieved July 20, 2008, from http://research.microsoft.com/Workshops/FS2006/papers/TheFutureofInformationTechnology.pdf
Pandian, C. R. (2006). Applied Software Risk Management: A Guide for Software Project Managers. Retrieved July 20, 2008, from http://books.google.com.pk/books?id=ZSrxDf8GrioC
Wikipedia. (2008). Project management. Retrieved July 20, 2008, from http://en.wikipedia.org/wiki/Project_management