Whether you are a seasoned IT professional or a casual web surfer, you have probably tried to compare desktop and web applications at some point. Simply speaking, a desktop application is a computer program that runs locally on a device such as a desktop or laptop computer, whereas a web application is delivered to a local device over the Internet from a remote server. Your user environment can determine whether a desktop or a web application is the better solution for your needs.
A Brief Summary of Application Evolution
Computers and software have come a long way since the first digital computers (the ABC and ENIAC) were created around the time of WWII. To save my readers some time, I will jump forward to more recent history: August 12, 1981, when IBM introduced the IBM PC hardware platform. IBM PCs ran a text-mode, command-line operating system known as MS-DOS (Microsoft Disk Operating System), which was eventually replaced by the graphical Microsoft Windows OS in the 1990s. Then the World Wide Web (WWW) took off in 1991, and the Mosaic web browser was announced in 1993. These changes affected our world a great deal.
Although early applications were developed to run on mainframe computers and were accessed via low-tech terminal devices, the increasing power and availability of (relatively) capable desktop computers ushered in an era of standalone desktop applications that ran locally on the PC. Client-server applications then emerged, replacing the mainframe with a server and allowing the remote client software to take over some of the processing tasks. Hardware specifications and broadband speeds improved steadily, which led to corresponding improvements in the quality and quantity of WWW content. Websites became more interactive as multimedia content grew, and they expanded their functionality beyond static web pages. As browsers and development platforms evolved, and more and more people began to use the Internet and email, more businesses established a presence in the online world. These businesses leveraged the emerging interactive capabilities of the web to introduce applications served directly to a web browser, and these web applications became very popular.
Differences Between Desktop and Web Applications
Desktop applications have traditionally been limited by the hardware on which they are run. They must be developed for and installed on a particular operating system, and may have strict hardware requirements that must be met to ensure that they function correctly. Updates to the applications must be applied by the user directly to their installation, and may require hardware upgrades or other changes in order to work. This hardware dependence, as well as the legacy of mainframe terminal applications, has typically limited the level of complexity in user interfaces for desktop applications.
In some ways, web applications are more reminiscent of the original mainframe applications, or of the later client-server model that was common for early desktop business applications. The user accesses the application through a web browser (in effect, a stand-in for the client software) and works with resources available over the Internet, including storage and CPU processing power. This approach allows "thin clients" (machines with limited hardware capabilities) to access complex applications delivered from a centralized infrastructure. Additionally, existing web browsers and their multimedia capabilities have allowed developers to create more interactive, media-rich user interfaces. Some of these capabilities have found their way back into desktop applications as well, but those changes have been driven largely by the ubiquity of the web and the ways users have grown accustomed to interacting with their computers.
As you can see, each type of application has its own strengths and weaknesses and is best utilized within its own niche. I believe that desktop and web applications will continue to coexist for a long time, or at least until the Internet becomes truly omnipresent and all computers become thin terminals that connect users to their respective digital environments in the global cloud.