Learning Computer Programming Terminology

Computer programmers use a range of terms that make sense only to them and to their colleagues. For other people who are not very adept with computers and programming, those words either mean something else or don't mean anything at all. Here's an overview of ten of the most commonly used and confused computer programming terms that everybody should know, explained in layman's terms:

1. 1GL
1GL means First-Generation Programming Language. It is a machine-level programming language written in ones and zeros. These are the programs that the computer's central processing unit can understand directly; there is no need to run them through a compiler or any other translation software. There are also second-, third-, and fourth-generation programming languages, each further removed from the hardware.
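As an illustration, here is the same operation expressed at three language generations, sketched in Python for display purposes. The machine code shown is from the classic Intel 8080 instruction set; the choice of CPU is incidental.

```python
# The same operation, "load the value 5 into register A", at three
# language generations. The 1GL form is the raw binary an Intel 8080
# CPU executes directly; the 2GL form is its assembly mnemonic.
first_gen = "00111110 00000101"   # 1GL: machine code (bytes 0x3E 0x05)
second_gen = "MVI A, 5"           # 2GL: assembly for the same two bytes
third_gen = "a = 5"               # 3GL: a high-level statement

# The binary really is the instruction: 00111110 is 0x3E, the 8080
# opcode for "move immediate into A", and 00000101 is the operand 5.
opcode, operand = (int(b, 2) for b in first_gen.split())
print(hex(opcode), operand)
```

Each later generation says the same thing in a form further from the hardware and closer to the programmer.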

2. Abandonware
Abandonware is software that is no longer used, supported, or sold by its publisher. Such software cannot legally be resold or redistributed to end users unless the publisher gives it away as freeware.

3. ActiveX
ActiveX is a Microsoft framework for creating reusable software components, which can be written in languages such as Visual Basic or C++. ActiveX controls provide active content and are commonly shared among applications already installed on the computer, and sometimes with components used by the operating system itself. In web browsers such as Internet Explorer, ActiveX controls were responsible for many of the interactive features of web-based programs.

4. Beta Version
When a piece of software, a website, or any application is tagged as a beta release or beta version, it means that version is a pre-release made available for testing. The core features and requirements are in place, but the software has not yet been fully tested for bugs; the beta version is essentially the program's public test run before the final release.

5. DirectX
DirectX is a collection of multimedia application programming interfaces (APIs) provided by Microsoft. It is built into the Windows operating system and is used mostly by games and other graphics-intensive applications.

6. Emulation
Emulation is the term for the capacity of one program or device to imitate another. There are different degrees of emulation. Emulators are often created for arcade and console game hardware so that their games can be played on a desktop computer.
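At its core, an emulator is a loop that reads the imitated machine's instructions and reproduces their effects in software. A minimal sketch in Python, for an imaginary three-instruction stack machine invented purely for illustration:

```python
# A toy emulator for an imaginary CPU with three instructions:
# ("PUSH", n) pushes n onto the stack, ("ADD",) adds the top two
# stack values, and ("PRINT",) records the top of the stack as output.
def run(program):
    stack, output = [], []
    for instr in program:            # the classic fetch/decode/execute loop
        op = instr[0]
        if op == "PUSH":
            stack.append(instr[1])
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "PRINT":
            output.append(stack[-1])
    return output

print(run([("PUSH", 2), ("PUSH", 3), ("ADD",), ("PRINT",)]))  # [5]
```

A real emulator does the same thing at vastly greater scale, also imitating the original machine's memory, graphics, and input hardware.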

7. Firmware
Firmware refers to software embedded in a device or piece of hardware. Firmware usually contains the low-level instructions and protocols the hardware needs to run so that it works smoothly with the other devices it is attached to.

8. HTML
HTML stands for Hypertext Markup Language, the language used to write web pages. It is a markup language rather than a programming language: it describes the different parts of a text document, denoting text as paragraphs, headings, lists, and so on.
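For example, a minimal fragment marking up a heading, a paragraph, and a list looks like this:

```html
<h1>My Page</h1>
<p>This is a paragraph of ordinary text.</p>
<ul>
  <li>First list item</li>
  <li>Second list item</li>
</ul>
```

The tags do not compute anything; they simply tell the browser what role each piece of text plays.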

9. Java
Java is an object-oriented programming language developed by Sun Microsystems. It resembles C++, but it was designed to avoid some of that language's flaws. Java is also widely used on the World Wide Web.

10. Linux
Linux is an operating system, in the same category as Windows and Mac OS. But unlike those two, Linux is free and open source: anybody can use, modify, and redistribute the software.

These are just some of the computer programming terms you should know, so that the next time you come across them you won't be guessing what they mean. Programming can be a complex subject, but if you learn the basics, such as these terms, you will surely find it interesting.


When Is the Right Time to Redesign?

If you run a website, chances are you often wonder whether it is the right time to do a total redesign of the layout of your website. Here are some points to consider:

Are you thinking of a redesign just for the sake of it? If so, it is not yet the right time. Remember, a design serves a specific purpose. If you are not sure whether to overhaul your site, keep in mind that your current design may serve a purpose you are not aware of, and you would lose that function in a redesign.

On the other hand, if your website has kept the same design since the 1990s, perhaps it is high time for a redesign. The last thing you want is for visitors to leave your site without even looking at your content just because the design is old-fashioned. If that is your situation, here are some points to ponder before redesigning.

Redesigning your website is like performing plastic surgery on it. The site loses its current identity (for better or worse), and your regular visitors might not recognize the new design at first glance; you risk losing them simply because they think they have landed on the wrong page. It is therefore very important to retain a characteristic feature from your old layout: perhaps your site's logo, perhaps the same text style for its title.

To play it safe, put a poll on your site to let your visitors do the talking. If they think it is necessary for the website to have a fresh look, give it to them!

The challenges of Web 2.0 applications

Rich Internet applications allow for dynamic, asynchronous data transfer, using multiple protocols and a variety of servers. They gather data from distributed, heterogeneous sources, including cloud-based and external data storage options. Thick clients with widgets and client-side functionality often have server-side components, which may need additional processing before the server sends the data back to the client. Developers who build these widgets, often adding them from available toolkits, do so on their development machines and don't realize that once separated across the network, the server component may cause latency and affect overall system performance.

New technologies such as Ajax enable prefetching, where every new letter a user enters into a search engine suggests a new set of results, dynamically delivered from the server. All this activity generates a lot of network traffic and can significantly impact performance. Network latency and bandwidth constraints can also create performance bottlenecks. To accurately predict the performance of an application, it is necessary to test individual components and services, but server monitoring and end-to-end performance testing, along with accurate WAN emulation, are equally critical.
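To make the traffic pattern concrete, here is a sketch (in Python, with a made-up word list and function name) of the server-side lookup behind such a type-ahead suggestion box. The client issues one request like this for every keystroke, which is exactly why prefetching multiplies network traffic:

```python
# Server-side sketch of a type-ahead suggestion endpoint. Each keystroke
# in the browser sends the current prefix; the server answers with matches.
WORDS = ["java", "javascript", "linux", "html", "firmware"]  # stand-in data

def suggest(prefix, limit=3):
    """Return up to `limit` words starting with `prefix` (case-insensitive)."""
    p = prefix.lower()
    return [w for w in WORDS if w.startswith(p)][:limit]

# Typing "jav" letter by letter triggers three separate round trips,
# each returning a fresh result set.
for prefix in ("j", "ja", "jav"):
    print(prefix, "->", suggest(prefix))
```

Three characters typed means three requests and three responses; under real network latency, each round trip adds directly to the perceived response time.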

Testing Web 2.0 applications presents its own set of challenges. The complexity of new technologies, the lack of commonly recognized and accepted standards, and the sheer multitude of emerging frameworks and toolkits make it difficult for companies to build Web 2.0 testing strategies and select appropriate automation solutions. Traditional testing tools focus on protocol-level verification, offering no framework-level support or ability to accurately recognize objects in these new, rich clients, making it virtually impossible to effectively validate the performance of Web 2.0 applications. Script creation, which has always been a lengthy, time-consuming process requiring domain and application expertise, becomes even more complex in Web 2.0 applications.

Impact of new development tools on IT professionals

The IT field has gone through some incredibly fast changes. Advances in technology have come quickly and in great numbers, and they have had an impact on developers that some in the IT field see as positive and others as negative. Advances in technology, and in the tools associated with the development process, have shifted both the tools developers use and the methodologies they use to manage their work.

One area that has seen change is programming methodologies. Rapid Application Development, a programming methodology of the 1990s, is today also the name of a category of software development tools that make it easier for developers to respond to today's increased demand for software solutions.

In the mid-1980s, structure was widely present in programmers' work. Programming languages such as COBOL required programmers to follow a given order when programming: A was followed by B, which preceded C. This logical structure gave programmers a reason to think about their code before they wrote it, and they used flow charts to create a visual representation of a program before writing any code.

This paper will cover the changes in Programming Methodologies, Development Concepts, and Software, and the impact experienced by IT professionals.

Programming Methodologies

Technological advances changed the way IT professionals addressed the demands of the client, and programming methodologies were the first to show these changes. Traditional methodologies were based on a structured, systematic approach to developing systems, forcing developers to "sign off" on each specification before development could proceed to the next step. The requirements and design were frozen, and the system was coded, tested, and implemented. With such conventional methods there was a long delay before the customer saw any results, and the development process could take so long that the customer's business could fundamentally change before the system was even ready for use.

This was the case until the early 1990s, when technology guru James Martin introduced a new development methodology he called "Rapid Application Development" (Wikipedia contributors, 2010). Rapid Application Development was born out of a need to create applications faster than other methods of the time would allow; it was a "merger of various structured techniques, especially data-driven Information Engineering, with prototyping techniques to accelerate software systems development" (Wikipedia contributors, 2010).

A reaction against heavyweight methods, characterized as heavily regulated, regimented, micromanaged, waterfall methods of development (Wikipedia contributors, 2010), led to what we know today as Agile, a programming methodology that focuses on making the development process even faster than Rapid Application Development did. Responding to increased demand for programming solutions, IT professionals have looked for ways to cut development time, resulting in a progressive change in programming methodologies.

Development Concepts

Client/server technology was in its infancy in 1992, and already there were signs of the impact that advances in this technology would have on networks: "The proliferation of networked applications will come with a large burden for network managers" (Ewald & Roy, 1992). Furthermore, client/server software tools were not available in large numbers, which made "application development cycles longer than necessary" (Ewald & Roy, 1992).

To make matters worse, "software developed with existing tools are not readily reusable" (Ewald & Roy, 1992).

The need for reusability gave birth to object orientation. Object orientation (OO) gave developers the ability to improve the deliverables, notations, techniques, and tools used in application development. "Its goal was to encapsulate design decisions with an anthropomorphic design mind-set and objects, classes, inheritance, polymorphism, and dynamic binding as its mechanisms" (Cockburn, 1993). In short, it was meant to provide developers with reusable objects and thereby cut development time.

Cockburn (1993) wrote that "Object-oriented (OO) development is characterized by: the encapsulation of process with data in both the application structure and the development methodology; anthropomorphic design, in which objects in the application are assigned 'responsibilities' to carry out; modeling the problem domain throughout development; emphasis on design and code reuse with extensibility; incremental and iterative development."
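A brief Python sketch, with classes invented purely for illustration, of three of the mechanisms named above: encapsulation of data with behavior, inheritance, and polymorphism with dynamic binding.

```python
class Account:
    """Encapsulation: the balance lives with the methods that manage it."""
    def __init__(self, balance=0):
        self._balance = balance
    def deposit(self, amount):
        self._balance += amount
    def fee(self):
        return 0

class SavingsAccount(Account):   # inheritance: reuses Account's code
    def fee(self):               # polymorphism: overrides the parent's behavior
        return 1

# Dynamic binding: the same call resolves to each object's own fee().
accounts = [Account(), SavingsAccount()]
print([a.fee() for a in accounts])  # [0, 1]
```

The reuse promised by OO is visible here: `SavingsAccount` inherits `deposit` unchanged and redefines only what differs.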

Software

Coupe and Onodu (1996) wrote in the Journal of Information Technology: "There is a need to develop new software and upgrade existing systems to meet competitive challenges, to plan effectively, and to manage the day-to-day running of organizations. The pressure is increasing for developers to work more efficiently and produce better quality systems more quickly than before" (Coupe & Onodu, 1996). Using new technologies to "reduce system development time within IT departments" made it possible for "these systems to be put into operation sooner" (Coupe & Onodu, 1996).

The systems Coupe and Onodu were referring to were computer-aided software engineering (CASE) tools, which at the time were seen as having a positive effect on developer productivity. A survey of developers in UK organizations confirmed the notion, finding that CASE tools "improved the reliability and accuracy of applications software" (Coupe & Onodu, 1996). Like any application development tool, however, CASE required continuous fine-tuning to keep it operating at peak efficiency.

These advances in technology led to other advances that IT professionals could benefit from. Such is the case with Web services, an advance in software development that prompted the major manufacturers of development software to create "tools that will make it easy for developers" (Dyck, 2001) to do their job.

Application development advances include tools that expand on the concept of an IDE. Microsoft introduced a research project called Code Canvas, which is likened to a road map of code, helping developers understand complexities and changes in code (Krill, 2009). Although still a research project, it shows the moves that technology companies such as Microsoft are making to create an even more visual development environment, in an attempt to bridge the gap between developers and designers.

Conclusion

Advances in technology and in the tools associated with the development process have placed increased demands on developers to be more efficient and to produce better-quality systems more quickly than before. This demand has driven changes in programming methodologies and tools and has introduced new programming concepts that let developers create reusable objects. Client/server technologies, object orientation, and Web services are just a sample of the programming concepts that are the direct result of the need for more productive developers.

The impact these advances have had on IT professionals has always been a hard issue to address, reflecting in part the difficulty of defining and measuring software quality (Ewald & Roy, 1992). Training developers in the use of new technologies can affect how researchers and other IT professionals perceive the impact of these changes: when training and hardware and software upgrades are present, the impact can be positive; when they are lacking, there is reason to label any advance in technology as negative.

It is not within the scope of this document to define the impact as negative or positive; rather, it is up to readers to come to their own conclusions based on the information made available to them. One conclusion worth noting is that organizations, educational institutions, and IT professionals tasked with developing solutions need to be aware of how changes in technology affect them. Advances in technology are often followed by other advances arising from the need to improve the tools developers use to do their jobs. It is not enough to train developers on emerging technologies; it is also necessary to provide them with the tools they need to use those technologies.

References

Agile software development. (2010, October 20). In Wikipedia, The Free Encyclopedia. Retrieved from http://en.wikipedia.org/w/index.php?title=Agile_software_development&oldid=391820065

Cockburn, A. A. R. (1993). The impact of object-orientation on application development. IBM Systems Journal, 32(3), 420. Retrieved from http://proquest.umi.com.library.capella.edu/pqdweb?did=547801&Fmt=7&clientId=62763&RQT=309&VName=PQD

Coupe, R. T., & Onodu, N. M. (1996). An empirical evaluation of the impact of CASE on developer productivity and software quality. Journal of Information Technology, 11(2), 173. Retrieved from http://proquest.umi.com.library.capella.edu/pqdweb?did=667612851&Fmt=7&clientId=62763&RQT=309&VName=PQD

Ewald, A., & Roy, M. (1992). The evolution of the Client/Server revolution. Network World, 9(46), 75. Retrieved from http://proquest.umi.com.library.capella.edu/pqdweb?did=677568&Fmt=7&clientId=62763&RQT=309&VName=PQD

Krill, P. (2009). Microsoft, IBM highlight software development advances. InfoWorld. Retrieved from http://www.networkworld.com/news/2009/070909-microsoft-ibm-highlight-software-development.html

Rapid application development. (2010, October 21). In Wikipedia, The Free Encyclopedia. Retrieved from http://en.wikipedia.org/w/index.php?title=Rapid_application_development&oldid=392053265

Dyck, T. (2001). Tools advance Web services. eWeek.com. Retrieved from http://www.eweek.com/c/a/Application-Development/Tools-Advance-Web-Services/