Unit #1 Overview

Learning Objectives

Lesson 1 - Overview of the Internet


The Ever-Expanding Network

In 1969, the United States Department of Defense funded the creation of ARPANET (the Advanced Research Projects Agency Network),
which connected computing centers run by government departments and universities.

These institutions wanted to connect their individual networks for large-scale information transfer.
However, many of them followed different standards and implementations.
In the 1970s, the Transmission Control Protocol (TCP) and the Internet Protocol (IP)
were developed to provide standards for the transfer of data that would allow these networks to communicate with each other.
When these protocols were implemented in the 1980s, institutions began to adopt them, forming the early internet.


The World Wide Web

While people today often use the terms internet and World Wide Web interchangeably, they actually refer to different things.
The internet refers to the actual network of interconnected computing devices, whereas the World Wide Web refers to the collection of websites
and web resources invented by Tim Berners-Lee in the early 1990s. These advances, in conjunction with the advent of web browsers, gave the internet a user-friendly interface,
leading to mass adoption across the 1990s and 2000s.


Browsers and Servers

As we've seen, the internet is a network that links devices worldwide, enabling people to share information over vast distances.
But how is information sent from one device to another?
One way of understanding this process is to look at the client-server model.
In this model, the client refers to the user's device or program making the request for information, and the server refers to the device that hosts files and awaits requests.

First, the client's browser takes the web address provided and makes a GET request to the server at that address, asking it to retrieve the desired information.
Once the server receives this request, it identifies the files specified by the client and sends this data back to the client.
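
As a rough sketch of this request-and-response exchange, the snippet below uses the browser's built-in fetch function to make a GET request; the address https://example.com/index.html is just a placeholder, not a URL from this lesson.

    // Ask the server at a placeholder address for a page.
    // fetch() uses the GET request method by default.
    fetch('https://example.com/index.html')
      .then(response => response.text())      // read the data the server sent back
      .then(html => console.log(html))        // a real browser would render this HTML
      .catch(error => console.error('Request failed:', error));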

The browser and server communicate with each other using HTTP, the Hypertext Transfer Protocol.
Each HTTP request uses a request method, which specifies the type of request being made, like the aforementioned GET request.

Request method    Explanation
GET               Requests data from a server
POST              Sends data to the server
PUT               Replaces existing data with updated information
DELETE            Deletes information
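
To make the table concrete, here is a small sketch of how the other request methods can be issued with fetch; the addresses and the message data are made-up examples, not part of any real service.

    // POST: send new data to the server.
    fetch('https://example.com/api/messages', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ text: 'Hello, server!' })
    });

    // PUT: replace existing data with updated information.
    fetch('https://example.com/api/messages/1', {
      method: 'PUT',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ text: 'An updated message' })
    });

    // DELETE: remove the data stored at this address.
    fetch('https://example.com/api/messages/1', { method: 'DELETE' });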


404 Status Code

When a server responds to a client request, it specifies a status code. Status codes indicate whether the HTTP request was successfully completed, or if there was an error
(they can contain information about the type of error that occurred). For example, a 404 status code means the requested resource could not be found. The status code helps the browser know how to respond to the data that has been sent back.
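
As an illustration, a script can read the status code from the response object that fetch provides; the address below is a placeholder.

    fetch('https://example.com/missing-page.html')
      .then(response => {
        // response.status holds the numeric status code, such as 200 or 404.
        if (response.ok) {                               // true for codes in the 200-299 range
          console.log('Success:', response.status);
        } else {
          console.log('Error status:', response.status); // e.g. 404 when the page is not found
        }
      });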


How do Browsers Work?

Every time a webpage loads, our device sends a request for each file that makes up that page.
So even if we're just loading one page, the client might send multiple requests for data, such as images, videos, or audio.
Given this, how does the process work when multiple requests are made simultaneously?
The following happens (potentially multiple times) in a split second:
1. When the user types a web address, the client sends a GET request, and the server processes this request, sending back the requested HTML data.
2. The browser searches through the HTML document and makes additional requests for the various data specified in that document (media, JavaScript, CSS, etc.).
In most modern browsers these requests are made in parallel, minimizing loading times.
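
As a simplified sketch of step 2, the snippet below requests several placeholder files in parallel rather than one after another, which is roughly what a modern browser does for the assets an HTML document lists.

    // Placeholder file names that an HTML document might reference.
    const assets = ['styles.css', 'app.js', 'logo.png'];

    // Start every request immediately, then wait for all of them to finish.
    Promise.all(assets.map(file => fetch(file)))
      .then(responses => console.log('Loaded', responses.length, 'assets'))
      .catch(error => console.error('An asset failed to load:', error));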


Web 2.0

The earliest static websites were composed of text, images, and links, with very little interactivity beyond browsing from one webpage to another.
These websites are called static because they don't change based on user behaviour.
As internet usage became more widespread, increasingly complex interactions became available.

A collection of advances in the early 2000's created a cluster of web applications that are often called "Web 2.0".
In comparison to early static websites, Web 2.0 applications could update specific sections of a webpage to minimize loading times.

Web 2.0 applications are defined by:
- Providing a dynamic user experience that responds to user input without reloading the entire page, and
- Emphasizing user-generated content and social sharing of information.

There were important technical innovations that enabled this change to the internet's user interface, such as:
- jQuery, an early JavaScript library that made it easy to fetch data while a webpage was still running (see the sketch after this list), and
- the rise of web frameworks that connected to databases, allowing user-generated content to be effectively stored, created, and displayed.
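
As a minimal sketch of this Web 2.0 pattern, shown here with today's built-in fetch rather than jQuery, the snippet below updates one section of a page without reloading it; the /latest-comments address and the comments element id are hypothetical.

    // Fetch new content and swap it into one section of the page,
    // leaving the rest of the page untouched (no full reload).
    fetch('/latest-comments')                                  // hypothetical endpoint
      .then(response => response.text())
      .then(html => {
        document.getElementById('comments').innerHTML = html;  // hypothetical element id
      });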


Current Trends in Internet Usage

The rise of internet-connected smartphones has profoundly changed how users interact with the internet.
Mobile traffic now accounts for more than half of all internet usage globally, and web development practices have evolved with this trend.
The rise of responsive web design has been enabled by advances such as the inclusion of media queries and relative units in CSS, allowing pages to adjust dynamically to the size of a display.
While most mobile applications don't form part of the web (they are designed to keep the user's attention on the app, not to facilitate browsing),
web development can be a great place to start, as it's increasingly common to see apps built with a JavaScript framework.
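
Responsive layouts are normally written directly in CSS media queries, but as a small JavaScript sketch of the same idea, window.matchMedia lets a script react to the display size; the 600px breakpoint and class names here are arbitrary examples.

    // Watch an arbitrary 600px breakpoint and react when the display size changes.
    const smallScreen = window.matchMedia('(max-width: 600px)');

    function applyLayout(query) {
      // query.matches is true when the display is 600px wide or narrower.
      document.body.className = query.matches ? 'mobile-layout' : 'desktop-layout';
    }

    applyLayout(smallScreen);                             // set the initial layout
    smallScreen.addEventListener('change', applyLayout);  // update whenever it changes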