Enrollment System Essay Sample
- Word count: 5149
- Category: computer
Interest in information systems has increased in recent years, not only in education but in all areas where resources are managed. Two main reasons account for this: the increasing population and the need for improved problem-solving tools.
Managing student information has always been a difficult task, and it is more so today than ever before, where administrators still use the traditional way of filing records in a cabinet. As the student population goes up, the task becomes more complex. Data should be stored in a safe place and be easy and fast to retrieve when someone needs it. Because the administrator’s task has become more complex, there have been efforts to improve the effectiveness of problem solving, and central to these are quantitative techniques and electronic devices such as computers.
In the field of education, researchers and theorists have focused intensively in recent years on examining the concepts and use of information to assist administrators, teachers, students and parents. Others have raised and discussed fundamental issues in the use of school information systems to facilitate judgment and decision-making in schools.
Schools, like any other organization, manage all sorts of data and information to ensure the attainment of their goals and objectives. The emerging need in most schools for accurate, relevant data and reliable information strengthens the case for a Student Information System.
BACKGROUND OF THE STUDY
Computerization is a control system that manages processes in an industrial workplace. It reduces human error and processing time, and can thus boost productivity and result in higher-quality products. In information systems, computerization is concerned with interrelating different but interdependent transactions. This can result in a system with well-integrated processes that perform much faster and more accurately than a manual system.
Enrollment is the process of entering and verifying the data of students to register them in a particular school. Different interrelated processes build up the enrollment procedure, called an Enrollment System (ES). An ES is used particularly in recording and retrieving student information. Tracking student information is also one feature of an ES, through which the school can trace the standing of a student. Verifying payments was also added, to update or browse a student’s billings.
An Enrollment System is a good example of a computerized process. It can lessen the workload and provide the accurate information needed by the school. As a result, it will benefit not only the students but the administration as a whole. An Enrollment System is essential in a school. In the case of Nyongani School Inc., enrollment is handled through a manual system. The Directress uses a manual system in recording and retrieving student information. She also keeps information about student payments; in fact, she does all the record keeping using only a ballpen and columnar sheets. The Registrar’s Department likewise uses a manual system for recording and retrieving student information, and the Accounting Office administers student payments manually.
OBJECTIVES OF THE STUDY
Because of the rapid growth of the student population at Charis Christian Institute, many problems are encountered by both parents and the school administration. Under the manual enrollment system, the production of information is time-consuming, corrections to student records are difficult to make, and tracking of student profiles is neither fast nor effective.
To develop an Automated Enrollment System for the Grade School and High School Departments of Charis Christian Institute – Imus that will provide fast and accurate enrollment.
Specific Objectives
• Provide a computerized enrollment system that can track records, make corrections, and generate data quickly and effectively.
• Provide a system that can secure all student information and records.
• Provide software that will lessen the workload of the teachers and the administration.
• Provide a system that gives students a successful enrollment experience.
SCOPE AND LIMITATION
Scope is a written range of view or action, an outlook; hence, room for the exercise of a function. It is the capacity for achievement, all in connection with a designated project.
Scope
• The enrollment system of the Grade School and High School of Charis Christian Institute – Imus
• File Management System
o Organizing of records, such as Adding, Deleting, Updating and Searching
o Database Management
o The system can only be accessed by authorized personnel.
Limitation sets boundaries to the system: to what extent the proposed system can go.
• The system is intended for Charis Christian Institute – Imus only.
• It will not cover other related student information systems such as accounting, the registration system, etc.
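The file-management operations named in the scope (adding, deleting, updating and searching of records) can be sketched in a few lines. This is only an illustrative sketch in Python, not the proposed system itself; the field names (student_id, name, level) are assumptions, not the actual record layout.

```python
records = {}  # in-memory stand-in for the student file

def add_student(student_id, name, level):
    """Add a new student record; duplicate IDs are rejected."""
    if student_id in records:
        raise ValueError("duplicate student ID")
    records[student_id] = {"name": name, "level": level}

def update_student(student_id, **changes):
    """Correct one or more fields of an existing record."""
    records[student_id].update(changes)

def search_student(student_id):
    """Return the record, or None if the student is not enrolled."""
    return records.get(student_id)

def delete_student(student_id):
    """Remove the record if it exists."""
    records.pop(student_id, None)

add_student("2013-001", "Juan dela Cruz", "Grade 1")
update_student("2013-001", level="Grade 2")   # e.g. promotion to next level
print(search_student("2013-001"))
```

In a real deployment these operations would sit behind the authorized-personnel check mentioned in the scope and persist to a database rather than a dictionary.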
This chapter presents the review of related literature and studies underlying the framework of the study. It includes the conceptual model of the study and the operational definition of terms.
REVIEW OF RELATED LITERATURE AND STUDIES
In exploration, we find new techniques and new knowledge, and even develop new substances, gadgets, equipment, processes or procedures; imagination and skill are employed by the researcher. The commodities, new devices and services in technology are needs of man for a better, fuller life, which is the concern of research. These useful arts are the products of the technological environment, and the end-user is society in general. This excerpt was stated by Josefina Estolas in the book Fundamentals of Research (1995). Science and technology are essential for national development and progress; the State shall give priority to research and development, invention, and their utilization, and to science and technology education, according to the 1987 Philippine Constitution (Article XIV, Section 10). Since computer power was the critical resource, efficiency of processing became the main goal. Emphasis was placed on automating existing processes, such as purchasing or paying, often within a single department, as indicated by Jeffrey A. Hofer in Modern Systems Analysis and Design (1996).
According to Wikipedia.org, a system is a set of interacting or interdependent components forming an integrated whole, or a set of elements (often called “components”) and relationships which are different from relationships of the set or its elements to other elements or sets. Fields that study the general properties of systems include systems theory, cybernetics, dynamical systems, thermodynamics, and complex systems. They investigate the abstract properties of systems’ matter and organization, looking for concepts and principles that are independent of domain, substance, type, or temporal scale. Some systems share common characteristics, including:
• A system has structure: it contains parts (or components) that are directly or indirectly related to each other.
• A system has behavior: it contains processes that transform inputs into outputs (material, energy or data).
• A system has interconnectivity: the parts and processes are connected by structural and/or behavioral relationships.
• A system’s structure and behavior may be decomposed via subsystems and sub-processes into elementary parts and process steps.
The term system may also refer to a set of rules that governs structure and/or behavior. Alternatively, and usually in the context of complex social systems, the term institution is used to describe the set of rules that govern structure and/or behavior.
The word system, in its meaning here, has a long history which can be traced back to Plato (Philebus), Aristotle (Politics) and Euclid (Elements). It had meant “total”, “crowd” or “union” in even more ancient times, as it derives from the verb sunìstemi, uniting, putting together. “System” means “something to look at”. You must have a very high visual gradient to have systematization. In philosophy, before Descartes, there was no “system”. Plato had no “system”. Aristotle had no “system”. (Marshall McLuhan in: McLuhan: Hot & Cool. Ed. by Gerald Emanuel Stearn. A Signet Book published by The New American Library, New York, 1967, p. 288).
HISTORY OF SYSTEM
In the 19th century the first to develop the concept of a “system” in the natural sciences was the French physicist Nicolas Léonard Sadi Carnot who studied thermodynamics. In 1824 he studied the system which he called the working substance, i.e. typically a body of water vapor, in steam engines, in regards to the system’s ability to do work when heat is applied to it. The working substance could be put in contact with either a boiler, a cold reservoir (a stream of cold water), or a piston (to which the working body could do work by pushing on it). In 1850, the German physicist Rudolf Clausius generalized this picture to include the concept of the surroundings and began to use the term “working body” when referring to the system.
One of the pioneers of the general systems theory was the biologist Ludwig von Bertalanffy. In 1945 he introduced models, principles, and laws that apply to generalized systems or their subclasses, irrespective of their particular kind, the nature of their component elements, and the relation or ‘forces’ between them. Significant development to the concept of a system was done by Norbert Wiener and Ross Ashby who pioneered the use of mathematics to study systems. In the 1980s the term complex adaptive system was coined at the interdisciplinary Santa Fe Institute by John H. Holland, Murray Gell-Mann and others. My list of basic tools is a partial answer to the question about what has changed: Over the past few years, large numbers of programmers have come to depend on elaborate tools to interface code with systems facilities. (Bjarne Stroustrup 1995)
As defined in wikipedia.org, computer software, or just software, is a collection of programs and related data that provides the instructions telling a computer what to do and how to do it. Software refers to one or more computer programs and data held in the storage of the computer. In other words, software is a set of programs, procedures, algorithms and its documentation concerned with the operation of a data processing system. Program software performs the function of the program it implements, either by directly providing instructions to the digital electronics or by serving as input to another piece of software. The term was coined to contrast with the older term hardware (meaning physical devices). In contrast to hardware, software “cannot be touched”. Software is also sometimes used in a narrower sense, meaning application software only. Sometimes the term includes data that has not traditionally been associated with computers, such as film, tapes, and records.
Computer software is so called to distinguish it from computer hardware, which encompasses the physical interconnections and devices required to store and execute (or run) the software. At the lowest level, executable code consists of machine language instructions specific to an individual processor. A machine language consists of groups of binary values signifying processor instructions that change the state of the computer from its preceding state. A program is an ordered sequence of instructions for changing the state of the computer in a particular sequence. Programs are usually written in high-level programming languages that are easier and more efficient for humans to use (closer to natural language) than machine language. High-level languages are compiled or interpreted into machine language object code. Software may also be written in an assembly language, essentially a mnemonic representation of a machine language using a natural-language alphabet. Assembly language must be assembled into object code via an assembler.
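The layering described above, where one line of high-level source becomes several lower-level instructions, can be observed directly with Python’s standard dis module, which disassembles a function into the interpreter’s bytecode (an analogue of machine instructions, not machine language itself). The function shown is just an illustrative example.

```python
import dis

def add(a, b):
    # One line of high-level source...
    return a + b

# ...is translated into several lower-level instructions
# (load the operands, apply the addition, return the result).
dis.dis(add)
```

The exact instruction names vary between interpreter versions, which is itself a reminder that the lower-level representation is specific to the machine (here, a virtual one) that executes it.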
HISTORY OF SOFTWARE
The first theory about software was proposed by Alan Turing in his 1936 essay Computable Numbers with an Application to the Entscheidungsproblem (decision problem). Colloquially, the term is often used to mean application software. In computer science and software engineering, software is all information processed by computer systems: programs and data. The academic fields studying software are computer science and software engineering. As more and more programs enter the realm of firmware, and the hardware itself becomes smaller, cheaper and faster as predicted by Moore’s law, elements of computing first considered software have joined the ranks of hardware. Most hardware companies today have more software programmers on the payroll than hardware designers, since software tools have automated many tasks of printed circuit engineers. Just like the auto industry, the software industry has grown from a few visionaries operating out of their garages with prototypes.
Steve Jobs and Bill Gates were the Henry Ford and Louis Chevrolet of their times, capitalizing on ideas already commonly known before they started in the business. Until that time, software was bundled with the hardware by original equipment manufacturers (OEMs) such as Data General, Digital Equipment and IBM. When a customer bought a minicomputer, at that time the smallest computer on the market, it did not come with pre-installed software; the software needed to be installed by engineers employed by the OEM. Computer hardware companies not only bundled their software, they also placed demands on the location of the hardware, in a refrigerated space called a computer room. For software development, the turning point is generally agreed to be the publication in the 1980s of the specifications for the IBM Personal Computer by IBM employee Philip Don Estridge. Today his move would be seen as a type of crowd-sourcing.
Most companies carried their software on the books at zero dollars, unable to claim it as an asset (this is similar to the financing of popular music in those days). When Data General introduced the Data General Nova, a company called Digidyne wanted to use its RDOS operating system on its own hardware clone. Data General refused to license its software (which was hard to do, since it was on the books as a free asset) and claimed its “bundling rights”. The Supreme Court set a precedent called Digidyne v. Data General in 1985: it let a 9th Circuit decision stand, and Data General was eventually forced into licensing the operating system software, because it was ruled that restricting the license to only DG hardware was an illegal tying arrangement. Unable to sustain the loss from lawyers’ fees, Data General ended up being taken over by EMC Corporation. The Supreme Court decision made it possible to value software, and also to purchase software patents.
There are many successful companies today that sell only software products, though there are still many common software licensing problems due to the complexity of designs and poor documentation, leading to patent trolls. With open software specifications and the possibility of software licensing, new opportunities arose for software tools that then became the de facto standard, such as DOS for operating systems, but also various proprietary word processing and spreadsheet programs. In a similar growth pattern, proprietary development methods became standard Software development methodology.
More and more major businesses and industries are being run on software and delivered as online services – from movies to agriculture to national defense. (Marc Andreessen)
According to Wikipedia.org, a database is an organized collection of data. The data is typically organized to model relevant aspects of reality (for example, the availability of rooms in hotels) in a way that supports processes requiring this information (for example, finding a hotel with vacancies). The term database is correctly applied to the data and their supporting data structures, and not to the database management system (DBMS). The database data collection together with the DBMS is called a database system. The term database system implies that the data is managed to some level of quality (measured in terms of accuracy, availability, usability, and resilience), and this in turn often implies the use of a general-purpose DBMS. A general-purpose DBMS is typically a complex software system that meets many usage requirements to properly maintain its databases, which are often large and complex. This is especially the case with client-server, near-real-time transactional systems, in which multiple users have access to the data, and data is concurrently entered and queried in ways that preclude single-thread batch processing. Most of the complexities of those requirements are still present with personal, desktop-based database systems.
Well-known DBMSs include Oracle, Sybase, FoxPro, IBM DB2, Linter, Microsoft Access, Microsoft SQL Server, MySQL, PostgreSQL and SQLite. A database is not generally portable across different DBMSs, but different DBMSs can inter-operate to some degree by using standards like SQL and ODBC together to support a single application built over more than one database. A DBMS also needs to provide effective run-time execution to properly support (e.g., in terms of performance, availability, and security) as many database end-users as needed. One way to classify databases involves the type of their contents, for example: bibliographic, document-text, statistical, or multimedia objects. Another way is by their application area, for example: accounting, music compositions, movies, banking, manufacturing, or insurance. The term database may be narrowed to specify particular aspects of an organized collection of data, and may refer to the logical database, to the physical database as data content in computer data storage, or to many other database sub-definitions.
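As a small illustration of SQL as the shared standard mentioned above, the following uses SQLite (one of the DBMSs just named) through Python’s built-in sqlite3 module; the same statements would run, with at most minor dialect changes, on most of the other systems listed. The table and column names are hypothetical, chosen to fit the enrollment setting.

```python
import sqlite3

# A throwaway in-memory database; a real system would use a file or server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO student (name) VALUES (?)", ("Ana Reyes",))

# The DBMS, not the application, decides how the row is stored and found.
row = conn.execute("SELECT name FROM student WHERE id = 1").fetchone()
print(row[0])
conn.close()
```

The point of the example is the division of labor: the application states what data it wants in SQL, and the DBMS handles storage, indexing, and concurrent access.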
HISTORY OF DATABASE
The database concept has evolved since the 1960s to ease increasing difficulties in designing, building, and maintaining complex information systems (typically with many concurrent end-users and with a large amount of diverse data). It has evolved together with database management systems, which enable the effective handling of databases. Though the terms database and DBMS define different entities, they are inseparable: a database’s properties are determined by its supporting DBMS. The Oxford English Dictionary cites a 1962 technical report as the first to use the term “data-base.” With the progress in technology in the areas of processors, computer memory, computer storage and computer networks, the sizes, capabilities, and performance of databases and their respective DBMSs have grown by orders of magnitude.
For decades it has been unlikely that a complex information system could be built effectively without a proper database supported by a DBMS. The utilization of databases is now so widespread that virtually every technology and product relies on databases and DBMSs for its development and commercialization, or even has them embedded in it. Organizations and companies, from small to large, likewise depend heavily on databases for their operations. No widely accepted exact definition exists for DBMS. However, a system needs to provide considerable functionality to qualify as a DBMS, and its supported data collection needs to meet respective usability requirements to qualify as a database. Thus, a database and its supporting DBMS are defined here by a set of general requirements. Virtually all existing mature DBMS products meet these requirements to a great extent, while less mature ones either meet them or converge toward meeting them.
Evolution of database and DBMS technology
The introduction of the term database coincided with the availability of direct-access storage (disks and drums) from the mid-1960s onwards. The term represented a contrast with the tape-based systems of the past, allowing shared interactive use rather than daily batch processing. In the earliest database systems, efficiency was perhaps the primary concern, but it was already recognized that there were other important objectives. One of the key aims was to make the data independent of the logic of application programs, so that the same data could be made available to different applications. In the period since the 1970s, database technology has kept pace with the increasing resources becoming available from the computing platform: notably the rapid increase in the affordable capacity and speed of disk storage and of main memory. This has enabled ever larger databases and higher throughput to be achieved. The first generation of general-purpose database systems was navigational: applications typically accessed data by following pointers from one record to another. The two main data models at this time were the hierarchical model, epitomized by IBM’s IMS system, and the Codasyl model (network model), implemented in a number of products such as IDMS.
The relational model, first proposed in 1970 by Edgar F. Codd, departed from this tradition by insisting that applications should search for data by content rather than by following links. This was considered necessary to allow the content of the database to evolve without constant rewriting of links and pointers. The relational model is made up of ledger-style tables, each used for a different type of entity. Data may be freely inserted, deleted and edited in these tables, with the DBMS doing whatever maintenance is needed to present a table view to the application/user. The relational part comes from entities referencing other entities, in what is known as a one-to-many relationship, like a traditional hierarchical model, and a many-to-many relationship, like a navigational (network) model. Thus, a relational model can express both hierarchical and navigational models, as well as its native tabular model, allowing for pure or combined modeling in terms of these three models, as the application requires. The earlier expressions of the relational model did not make relationships between different entities explicit in the way practitioners were used to; instead they were expressed as primary keys and foreign keys.
These keys, though, can also be seen as pointers in their own right, stored in tabular form. This use of keys rather than pointers conceptually obscured relations between entities, at least as it was presented back then. Thus, the wisdom at the time was that the relational model emphasizes search rather than navigation, and that it was a good conceptual basis for a query language but less well suited as a navigational language. As a result, another data model, the entity-relationship model, which emerged shortly later (1976), gained popularity for database design, as it emphasized a more familiar description than the earlier relational model. Later on, entity-relationship constructs were retrofitted as a data modeling construct for the relational model, and the difference between the two has become irrelevant.
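The key-based linking described above can be made concrete with a small sketch, again using SQLite via Python’s sqlite3 module. The section and student tables and their contents are hypothetical examples: the relationship between them is recovered at query time by matching a foreign key to a primary key, not by following stored pointers.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE section (
        id   INTEGER PRIMARY KEY,
        name TEXT
    );
    CREATE TABLE student (
        id         INTEGER PRIMARY KEY,
        name       TEXT,
        section_id INTEGER REFERENCES section(id)  -- declared foreign key
    );
""")
conn.execute("INSERT INTO section VALUES (1, 'Grade 1 - Faith')")
conn.executemany("INSERT INTO student VALUES (?, ?, ?)",
                 [(1, "Ana", 1), (2, "Ben", 1)])

# One-to-many relationship (one section, many students), reconstructed
# by a join on key values rather than by navigating record pointers.
rows = conn.execute("""
    SELECT student.name FROM student
    JOIN section ON student.section_id = section.id
    WHERE section.name = 'Grade 1 - Faith'
""").fetchall()
print([r[0] for r in rows])
conn.close()
```

Because the link lives only in the data values, either table can be reorganized or extended without rewriting pointers, which is exactly the flexibility the relational model was arguing for.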
Earlier relational system implementations lacked the sophisticated automated optimizations of conceptual elements and operations versus their physical storage and processing counterparts, present in modern DBMSs (Database Management Systems), so their simplistic and literal implementations placed heavy demands on the limited processing resources at the time. It was not until the mid 1980s that computing hardware became powerful enough to allow relational systems (DBMSs plus applications) to be widely deployed. By the early 1990s, however, relational systems were dominant for all large-scale data processing applications, and they remain dominant today (2012) except in niche areas. The dominant database language is the standard SQL for the Relational model, which has influenced database languages for other data models. The rigidity of the relational model, in which all data are held in related tables with a fixed structure of rows and columns, has increasingly been seen as a limitation when handling information that is richer or more varied in structure than the traditional ‘ledger-book’ data of corporate information systems.
These limitations come into play when modeling document databases, engineering databases, multimedia databases, or databases used in the molecular sciences. Most of that rigidity, though, is due to the need to represent new data types other than text and text-alike within a relational model. Examples of unsupported data types are:
• graphics (and operations such as pattern-matching and OCR);
• multidimensional constructs such as 2D (geographical), 3D (geometrical), and multidimensional hypercube models (data analysis);
• XML (a hierarchical data modeling technology evolved from SGML and HTML), used for data interchange among dissimilar systems.
More fundamental conceptual limitations came with object-oriented methodologies, with their emphasis on encapsulating data and processes (methods), as well as on expressing constructs such as events or triggers. Traditional data modeling constructs emphasize the total separation of data from processes, though modern DBMSs do allow for some limited modeling in terms of validation rules and stored procedures. Various attempts have been made to address this problem, many of them under banners such as post-relational or NoSQL. Two developments of note are the object database and the XML database. The vendors of relational databases have fought off competition from these newer models by extending the capabilities of their own products to support a wider variety of data types.
A DBMS has evolved into a complex software system, and its development typically requires thousands of person-years of effort. Some general-purpose DBMSs, like Oracle, Microsoft SQL Server, FoxPro, and IBM DB2, have been undergoing upgrades for thirty years or more. General-purpose DBMSs aim to satisfy as many applications as possible, which typically makes them even more complex than special-purpose databases. However, the fact that they can be used “off the shelf”, as well as their cost amortized over many applications and instances, makes them an attractive alternative (vs. one-time development) whenever they meet an application’s requirements. Though attractive in many cases, a general-purpose DBMS is not always the optimal solution: when certain applications are pervasive, with many operating instances, each with many users, a general-purpose DBMS may introduce unnecessary overhead and too large a “footprint” (too much unnecessary, unutilized software code).
Such applications usually justify dedicated development. Typical examples are email systems: though they need to possess certain DBMS properties, they are built in a way that optimizes email message handling and managing, and they do not need significant portions of general-purpose DBMS functionality. A major purpose of a database system is to provide users with an abstract view of the data; that is, the system hides certain details of how the data are stored and maintained, as stated by Abraham Silberschatz in Database System Concepts (1999). A database is an organized collection of facts and information. An organization’s database can contain facts and information on customers, employees, inventory, competitors, sales and much more. Most managers and executives believe a database is one of the most valuable and important parts of a computer-based information system, in accordance with Ralph M. Stair’s Fundamentals of Information Systems (2001).
According to Wiki.org, computer programming (often shortened to programming, scripting, or coding) is the process of designing, writing, testing, debugging, and maintaining the source code of computer programs. This source code is written in one or more programming languages (such as C++, C#, Java, Python, Smalltalk, etc.). The purpose of programming is to create a set of instructions that computers use to perform specific operations or to exhibit desired behaviors. The process of writing source code often requires expertise in many different subjects, including knowledge of the application domain, specialized algorithms and formal logic. Within software engineering, programming (the implementation) is regarded as one phase in a software development process. There is an ongoing debate on the extent to which the writing of programs is an art form, a craft, or an engineering discipline. In general, good programming is considered to be the measured application of all three, with the goal of producing an efficient and evolvable software solution (the criteria for “efficient” and “evolvable” vary considerably).
The discipline differs from many other technical professions in that programmers, in general, do not need to be licensed or to pass any standardized (or governmentally regulated) certification tests in order to call themselves “programmers” or even “software engineers.” Because the discipline covers many areas, which may or may not include critical applications, it is debatable whether licensing is required for the profession as a whole. In most cases, the discipline is self-governed by the entities which require the programming, which sometimes impose very strict environments. However, representing oneself as a “Professional Software Engineer” without a license from an accredited institution is illegal in many parts of the world. Another ongoing debate is the extent to which the programming language used in writing computer programs affects the form that the final program takes. This debate is analogous to that surrounding the Sapir–Whorf hypothesis in linguistics and cognitive science, which postulates that a particular spoken language’s nature influences the habitual thought of its speakers: different language patterns yield different patterns of thought. This idea challenges the possibility of representing the world perfectly with language, because it acknowledges that the mechanisms of any language condition the thoughts of its speaker community.
HISTORY OF PROGRAM
Ancient cultures had no conception of computing beyond simple arithmetic. The only mechanical device that existed for numerical computation at the beginning of human history was the abacus, invented in Sumeria circa 2500 BC. Later, the Antikythera mechanism, invented sometime around 100 BC in ancient Greece, was the first mechanical calculator utilizing gears of various sizes and configurations to perform calculations; it tracked the Metonic cycle still used in lunar-to-solar calendars and is consistent with calculating the dates of the Olympiads. The Kurdish medieval scientist Al-Jazari built programmable automata in 1206 AD. One system employed in these devices was the use of pegs and cams placed into a wooden drum at specific locations, which would sequentially trigger levers that in turn operated percussion instruments.
The output of this device was a small drummer playing various rhythms and drum patterns. The Jacquard Loom, which Joseph Marie Jacquard developed in 1801, used a series of pasteboard cards with holes punched in them. The punched holes represented the pattern that the loom had to follow in weaving cloth, and the loom could produce entirely different weaves using different sets of cards. Charles Babbage adopted the use of punched cards around 1830 to control his Analytical Engine. The first computer program was written for the Analytical Engine by the mathematician Ada Lovelace, to calculate a sequence of Bernoulli numbers. The synthesis of numerical calculation, predetermined operation and output, along with a way to organize and input instructions in a manner relatively easy for humans to conceive and produce, led to the modern development of computer programming. Development of computer programming accelerated through the Industrial Revolution.
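As a nod to that first program, the Bernoulli numbers that Lovelace's program was designed to produce can be computed today in a few lines of Python, using the standard recurrence B_m = -(1/(m+1)) * sum over k < m of C(m+1, k) * B_k. This is a modern sketch, not a reconstruction of her actual method for the Analytical Engine.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n (B_1 = -1/2 convention)."""
    B = [Fraction(1)]                     # B_0 = 1
    for m in range(1, n + 1):
        # sum_{k=0}^{m-1} C(m+1, k) * B_k, kept exact with Fractions
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))            # the recurrence for B_m
    return B

print(bernoulli(4))
```

Exact rational arithmetic matters here: the odd-index Bernoulli numbers beyond B_1 are exactly zero, which floating-point rounding would obscure.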
According to Barbara Kruger: “What makes the production of my work so expensive? The whole installation thing – the construction, the objects, the technology. It really adds up.” Installation is the fixed or semi-fixed placement of a complete system or a self-contained unit, with its accompanying assemblies, accessories and parts. Installation generally also includes the provision of, or connection to, services (such as power and water supply) required to make the installed equipment ready for operation.
HISTORY OF INSTALLATION
The “Installation History” tab lists the history of provisioning operations performed on the system. Previous configurations can be identified here as previous states of the system; a state/configuration can be identified simply by the set of installed features. When you perform a provisioning operation, such as installing or uninstalling features, a system state/configuration change occurs.
DEFINITION OF TERMS
• SYSTEM – any organized assembly of resources and procedures united and regulated by interaction or interdependence to accomplish a set of specific functions.
• SOFTWARE – a collection of computer programs and related data that provide instructions telling the computer what to do and how to do it.
• DATABASE – a system intended to organize, store and retrieve large amounts of data easily.
• PROGRAM – (also a software program) a set of instructions written to perform a specified task for a computer.
• INSTALLATION – (or setup) of a program (including drivers, plug-ins, etc.), the act of putting the program onto a computer system so that it can be executed.