Software Engineering – Reason and a Concept!

A few decades back, when computers were newly invented and still completely unfamiliar to most people, very few could operate them, and software was given little emphasis. At that time, hardware was the most important factor deciding the cost of implementation and the success rate of the system being developed. Very few people knew how to program; computer programming was considered an art gifted to a few rather than a skill of logical thinking. This approach was full of risk, and in many cases the system undertaken for development never reached completion. Soon after, some emphasis shifted towards software development, which started a new era. Slowly, people began giving more importance to software development.

The people who wrote software hardly followed any methodology, approach or discipline that would lead to the successful implementation of a bug-free, fully functional system. There was hardly any specific documentation, system design approach or related material; such things were confined to those who developed hardware systems. Software development plans and designs existed only as concepts in the mind.

Even after many people entered the field, the lack of proper development strategies, documentation and maintenance plans meant that software systems became costlier than before and took longer to develop (sometimes it was next to impossible to predict the completion date of a system under development). Code bases grew to very large numbers of lines, increasing the complexity of the project, and as complexity increased, so did the number of bugs. Much of the time the system that was developed was unusable by the customer because of very late delivery and numerous defects, and there were no plans for maintaining the system once it was in use. This led to the situation called the 'Software Crisis'. Most software projects, which existed only as concepts in the mind with no standard methodologies or practices to follow, ended in failure, causing losses of millions of dollars.

The 'Software Crisis' made people think seriously about software development processes and the practices that could be followed to ensure a successful, cost-effective implementation that is delivered on time and actually used by the customer. People were compelled to devise new ideas for the systematic development of software systems. This thinking gave birth to the most crucial part of the software development process, one that embodies the foundations of modern project management: treating software development from an engineering perspective. This approach is called 'Software Engineering'.

The standard definition of 'Software Engineering' is 'the application of a systematic, disciplined, quantifiable approach to the development, operation and maintenance of software; that is, the application of engineering to software.'

Software Engineering applies a systematic approach to developing any software project. It shows how a project can be handled systematically and cost-effectively and completed with a higher success rate. It includes planning and developing strategies, defining timelines, following guidelines to ensure the successful completion of particular phases, following predefined Software Development Life-Cycles, and using documentation plans for follow-up, all in order to complete the various phases of the software development process and provide better support for the system developed.

Software Engineering takes an all-round approach to discovering the customer's needs, even asking customers for their opinions before proceeding towards development of the desired product. Methodologies such as the 'Waterfall Model' and the 'Spiral Model' were developed under Software Engineering to provide guidelines to follow during development, helping to ensure on-time completion of the project. These approaches divide the development process into small phases such as requirement gathering and analysis, system design, and coding, which makes the project much easier to manage. They also help in understanding the problems faced during development and after deployment at the customer's site, and in defining strategies to address them and to provide strong support for the delivered system. For example, problems with one phase are resolved in the next, and after deployment, issues such as user queries and previously undetected bugs are handled under support and maintenance of the system, with all of these strategies decided while following the chosen methodology.

Hyper-Threading Technology

We all want our computers to be as fast as they can be, and there are many ways to increase performance through different kinds of upgrades. Processors have become faster because of demand and competition: to speed them up, chipmakers keep creating new CPU architectures that process information more efficiently and milk every ounce of available processing power. Intel created Hyper-Threading technology as an upgrade to its CPU architecture and quietly integrated it into some of its processors for development and testing purposes.

It is based on the idea of simultaneous multi-threading (SMT), where multiple physical CPUs process multiple threads at once. As an alternative to using multiple physical processors, Intel created multiple logical processors inside a single physical CPU. Intel recognized that CPUs are inherently inefficient and contain lots of computing power that never gets used.

It allows multi-threaded software applications to execute threads in parallel, and the improved resource utilization yields higher processing throughput. It is essentially a more advanced form of super-threading that was first introduced on Intel Xeon processors and later added to the Pentium 4. This type of threading technology had not previously been present in general-purpose microprocessors.

To boost performance, software was threaded by splitting instructions into multiple streams so that multiple processors could act on them. With Hyper-Threading, this threading can be exploited at the processor level, providing more efficient use of resources, greater parallelism and improved performance for today's multi-threaded software.
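The idea of splitting work into multiple streams can be sketched with a small, hypothetical example, assuming Python's standard `concurrent.futures` module; on a Hyper-Threading CPU the operating system is free to schedule these threads on separate logical processors:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """One stream of work: sum a slice of the data."""
    return sum(chunk)

data = list(range(1_000_000))
n_threads = 4  # hypothetical choice; often matched to the logical CPU count

# Split the data into one chunk per thread (striding keeps chunks equal-sized).
chunks = [data[i::n_threads] for i in range(n_threads)]

with ThreadPoolExecutor(max_workers=n_threads) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # same result as sum(data), computed as four independent streams
```

Note that in CPython the global interpreter lock limits true parallelism for pure-Python computation; the sketch only illustrates how a workload is divided into independent streams that a scheduler can spread across logical processors.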

Hyper-Threading is a multi-threading technology in which SMT is achieved by duplicating the architectural state on each processor, while sharing one set of processor execution resources. It also produces faster response times for a multi-tasking workload environment. By permitting the processor to use on-die resources that would otherwise have been idle, it offers a performance boost on multi-threading and multi-tasking operations for the microarchitecture.

In a CPU, every clock cycle offers the opportunity to perform one or more operations, but a single processor can only handle so much in an individual cycle. Hyper-Threading permits a single physical CPU to fool an operating system capable of SMT operations into thinking there are two processors.

It produces logical processors that handle multiple threads in the same time slice, where a single physical processor would normally handle only a single stream of operations. Some prerequisites must be satisfied before you can take advantage of this technology: you need a Hyper-Threading-enabled processor, an HT-enabled chipset and BIOS, and an operating system that supports multiple threads. Finally, the number and types of applications being used also affect how much performance improves.
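Whether these prerequisites are met can be probed from software. Below is a minimal sketch, assuming a Linux system where the kernel exposes an SMT flag in sysfs (the path used is an assumption and is absent on other platforms); `os.cpu_count()` reports logical processors on any operating system:

```python
import os

def smt_status():
    """Best-effort check of whether SMT/Hyper-Threading is active.

    Reads the Linux sysfs flag if present (assumption: this path exists
    only on reasonably modern Linux kernels); returns None when the
    answer cannot be determined on this platform.
    """
    path = "/sys/devices/system/cpu/smt/active"
    try:
        with open(path) as f:
            return f.read().strip() == "1"
    except OSError:
        return None

# os.cpu_count() counts *logical* processors, so with Hyper-Threading
# it reports twice the number of physical cores.
print(f"logical CPUs: {os.cpu_count()}, SMT active: {smt_status()}")
```

A return value of `None` simply means the check could not be performed; applications that tune thread counts to the topology usually fall back to the logical CPU count in that case.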

Hyper-Threading is a hardware upgrade that makes use of a CPU's wasted capacity, but it also helps the operating system and applications run more efficiently and do more at once. There are millions of transistors inside a CPU that turn on and off to process commands.

Chipmakers typically add brute-force computing power by adding more transistors, but more transistors mean a larger CPU and more heat. Hyper-Threading instead aims to increase performance without significantly increasing the number of transistors on the chip, keeping the CPU footprint small.

It offers two logical processors in one physical package. Each logical processor must share external resources like memory, hard disk, etc. and must also use the same physical processor for computations. The performance boost will not scale the same way as a true multiprocessor architecture, because of the shared nature of Hyper-Threading processors. System performance will be somewhere between that of a single CPU without Hyper-Threading and a multi-processor system with two comparable CPUs.

From an application's point of view, the technology is largely transparent. Some applications are already multi-threaded and will automatically benefit from it; such applications take full advantage of the increased performance on offer, letting users see immediate gains when multitasking. It also improves reaction and response times and increases the number of users a server can support. Today's multi-processing software is compatible with Hyper-Threading-enabled platforms, but further performance gains can only be realized by specifically tuning the software to utilize it. For future software optimization and business growth, this technology complements traditional multi-processing by providing additional headroom.