- focused and purposeful (so not recreational browsing),
- uses internet information or internet-based resources (like internet discussion forums),
- tends towards the immediate (drawing answers from information you can access without delay)
- and tends to favor information that is available without a purchase price.
Internet research has had a profound impact on the way ideas are formed and knowledge is created. Common applications of Internet research include personal research on a particular subject (something mentioned on the news, a health problem, etc.), students doing research for academic projects and papers, and journalists and other writers researching stories.
Research is a broad term. Here, it is used to mean “looking something up (on the Web)”. It includes any activity in which a topic is identified and an effort is made to actively gather information for the purpose of furthering understanding. It may also include some post-collection analysis, such as assessing quality or synthesizing findings.
For example, the Web can be searched, and hundreds or thousands of pages with some relation to the topic can typically be found within seconds. In addition, email (including mailing lists), online discussion forums (aka message boards, BBSes), and other personal communication facilities (instant messaging, IRC, newsgroups, etc.) can provide direct access to experts and other individuals with relevant interests and knowledge.
So defined, Internet research is distinct from library research (focusing on library-bound resources) and commercial database research (focusing on commercial databases). While many commercial databases are delivered through the internet, and some libraries purchase access to library databases on behalf of their patrons, searching such databases is generally not considered part of “Internet research”. It should also be distinguished from scientific research (research following a defined and rigorous process) carried out on the Internet, from straightforward retrieving of details like a name or phone number, and from research about the Internet.
Internet research has strengths and weaknesses. Strengths include speed, immediacy, and indifference to physical distance. The quality of Internet research can be superior to that of other forms of research, but usually is not. Weaknesses include unrecognized bias, difficulty in verifying a writer’s credentials (and therefore the accuracy or pertinence of the information obtained), and the skill required to draw meaningful results from the abundance of material typically available. The first resources retrieved may not be the most suitable for answering a particular question. For example, popularity is often a factor used in structuring internet search results, but popular information is not always the most correct or the most representative of the breadth of knowledge and opinion on a topic.
While commercial research fosters a deep concern with costs, and library research a concern with access, internet research fosters a deep concern with quality, with managing the abundance of information, and with avoiding unintended bias. This is partly because Internet research occurs in a less mature information environment: one with less sophisticated, poorly communicated search skills and much less effort invested in organizing information. Library and commercial research offer many search tactics and strategies unavailable on the internet, and the library and commercial environments invest more deeply in organizing and vetting their information.
The most popular search tools for finding information on the internet include Web search engines, meta search engines, Web directories, and specialty search services. A Web search engine uses software known as a Web crawler to follow the hyperlinks connecting the pages of the World Wide Web. The information on these Web pages is indexed and stored by the search engine. To access this information, a user enters keywords in a search form; the search engine queries its indices to find Web pages containing the keywords and displays them in a search engine results page (SERP). The SERP typically lists hyperlinks accompanied by brief descriptions of the content found. Search results are ranked by complex algorithms that take into consideration the location and frequency of keywords on a Web page, along with the quality and number of external hyperlinks pointing at the Web page.
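The index-then-query cycle described above can be sketched in miniature. The pages, URLs, and scoring rule below are illustrative assumptions, not any real engine's data or method; ranking here simply counts keyword occurrences, a crude stand-in for the complex algorithms real engines use.

```python
from collections import Counter, defaultdict

# Hypothetical mini-corpus standing in for pages a crawler has fetched.
PAGES = {
    "a.example/history": "history of the web and the browser wars",
    "b.example/css": "css layout history and web design",
    "c.example/cooking": "recipes and cooking tips",
}

def build_index(pages):
    """Index each keyword to the set of pages containing it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def search(index, pages, query):
    """Return pages containing every query keyword, ranked by keyword frequency."""
    terms = query.split()
    if not terms:
        return []
    hits = set.intersection(*(index.get(t, set()) for t in terms))
    score = lambda url: sum(Counter(pages[url].split())[t] for t in terms)
    return sorted(hits, key=score, reverse=True)
```

A query such as `search(build_index(PAGES), PAGES, "web history")` returns only the pages containing both keywords, highest-scoring first.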
A meta search engine enables users to enter a search query once and have it run against multiple search engines simultaneously, creating a list of aggregated search results. Since no single search engine covers the entire web, a meta search engine can produce a more comprehensive search of the web. Most meta search engines automatically eliminate duplicate search results. However, meta search engines have a significant limitation: the most popular search engines, such as Google, are often excluded because of legal restrictions.
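The fan-out-and-merge behaviour can be sketched as follows; the two "engines" are stand-ins that return canned lists, since querying real engines is exactly what the legal restrictions above complicate.

```python
def meta_search(query, engines):
    """Run one query against several engines and merge the results,
    automatically eliminating duplicate URLs (first occurrence wins)."""
    seen, merged = set(), []
    for engine in engines:
        for url in engine(query):
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

# Hypothetical engines returning canned result lists for illustration.
engine_a = lambda q: ["x.example/1", "y.example/2"]
engine_b = lambda q: ["y.example/2", "z.example/3"]
```

Running `meta_search("any query", [engine_a, engine_b])` yields three results, with the duplicate `y.example/2` reported only once.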
A Web directory organizes subjects in a hierarchical fashion that lets users investigate the breadth of a specific topic and drill down to find relevant links and content. Web directories can be assembled automatically by algorithms or handcrafted. Human-edited Web directories have the distinct advantage of higher quality and reliability, while those produced by algorithms can offer more comprehensive coverage. The scope of Web directories is generally broad: DMOZ, Yahoo! and The WWW Virtual Library cover a wide range of subjects, while other directories focus on specific topics.
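The drill-down behaviour of a hierarchical directory can be sketched with nested dictionaries; the categories and link entries below are invented for illustration and do not reflect any real directory's taxonomy.

```python
# A toy hand-edited directory: nested dicts are categories,
# and a "links" list holds the entries filed under a category.
DIRECTORY = {
    "Computers": {
        "Internet": {"links": ["internet.example"]},
        "Software": {"links": ["software.example"]},
    },
    "Arts": {"links": ["arts.example"]},
}

def drill_down(directory, path):
    """Follow a category path (e.g. Computers > Internet) down to its links."""
    node = directory
    for category in path:
        node = node[category]
    return node.get("links", [])
```

For example, `drill_down(DIRECTORY, ["Computers", "Internet"])` walks two levels down before returning the links filed there.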
Specialty search tools enable users to find information that conventional search engines and meta search engines cannot access because the content is stored in databases. In fact, the vast majority of information on the web is stored in databases that require users to go to a specific site and access it through a search form. Often, the content is generated dynamically. As a consequence, Web crawlers are unable to index this information. In a sense, this content is “hidden” from search engines, leading to the term invisible or deep Web. Specialty search tools have evolved to provide users with the means to quickly and easily find deep Web content. These specialty tools rely on advanced bot and intelligent agent technologies to search the deep Web and automatically generate specialty Web directories, such as the Virtual Private Library.
When using the Internet for research, countless websites appear for any search query entered. Each of these sites has one or more authors or sponsoring organizations, and who authored or sponsored a website bears directly on the accuracy and reliability of the information it presents.
While it is not imperative that the authorship be determined for every website during Internet research, who authored or sponsored a website is essential knowledge when one cares about the accuracy and reliability of the information, bias, and/or web safety. For example, a website about civil rights that is authored by a member of an extremist group most likely will not contain accurate or unbiased information.
The author or sponsoring organization of a website may be found in several ways. Sometimes it is listed at the bottom of the site’s home page. Another way is to look in the ‘Contact Us’ section of the website, where the author may be listed directly, inferred from an email address, or identified by emailing and asking. If the author’s name or sponsoring organization cannot be determined, one should question the trustworthiness of the website. If it can be found, a simple Internet search can provide information with which to judge whether the website is reliable and unbiased.
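As a small illustration of the ‘Contact Us’ route, a script can scan a page's HTML for mailto: links. The sample markup and the pattern below are assumptions for demonstration only; real pages often obfuscate or omit addresses, so this is a starting point, not a reliable method.

```python
import re

def find_contact_emails(html):
    """Collect the addresses that appear in mailto: links on a page."""
    return sorted(set(re.findall(r"mailto:([\w.+-]+@[\w.-]+\.\w+)", html)))

# A made-up 'Contact Us' snippet for demonstration.
sample = '<p>Questions? <a href="mailto:editor@example.org">Email the editor</a>.</p>'
```

Here `find_contact_emails(sample)` surfaces `editor@example.org`, whose domain could then be researched further.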
Web design encompasses many different skills and disciplines in the production and maintenance of websites. The different areas of web design include web graphic design; interface design; authoring, including standardised code and proprietary software; user experience design; and search engine optimization. Often many individuals will work in teams covering different aspects of the design process, although some designers will cover them all. The term web design is normally used to describe the design process relating to the front-end (client side) design of a website, including writing markup. Web design partially overlaps web engineering in the broader scope of web development. Web designers are expected to have an awareness of usability and, if their role involves creating markup, they are also expected to be up to date with web accessibility guidelines.
Although web design has a fairly recent history, it can be linked to other areas such as graphic design. Web design can also be seen from a technological standpoint. It has become a large part of people’s everyday lives: it is hard to imagine the Internet without animated graphics, different styles of typography, backgrounds and music.
The start of the web and web design
Evolution of web design
In 1996, Microsoft released its first competitive browser, complete with its own features and HTML tags. It was also the first browser to support style sheets, which at the time was seen as an obscure authoring technique. The HTML markup for tables was originally intended for displaying tabular data, but designers quickly realized the potential of using HTML tables for creating the complex, multi-column layouts that were otherwise not possible. At this time, design and good aesthetics seemed to take precedence over good markup structure, and little attention was paid to semantics and web accessibility. HTML sites were limited in their design options, even more so with earlier versions of HTML. To create complex designs, many web designers had to use complicated table structures or even blank spacer .GIF images to stop empty table cells from collapsing. CSS was introduced in December 1996 by the W3C to support presentation and layout; this allowed HTML code to be semantic rather than both semantic and presentational, and improved web accessibility (see tableless web design).
End of the first browser wars
During 1998, Netscape released its Netscape Communicator code under an open source licence, enabling thousands of developers to participate in improving the software. However, these developers decided to stop and start from the beginning; their work guided the development of the open source browser Mozilla, which soon expanded into a complete application platform. The Web Standards Project was formed and promoted browser compliance with HTML and CSS standards by creating the Acid1, Acid2, and Acid3 tests. 2000 was a big year for Microsoft: Internet Explorer was released for the Mac, significant because it was the first browser to fully support HTML 4.01 and CSS 1, raising the bar in terms of standards compliance. It was also the first browser to fully support the PNG image format. During this time Netscape was sold to AOL, and this was seen as Netscape’s official loss to Microsoft in the browser wars.
Since the start of the 21st century, the web has become more and more integrated into people’s lives, and as this has happened the technology of the web has also moved on. There have also been significant changes in the way people use and access the web, and this has changed how sites are designed.
Since the end of the browser wars, new browsers have come onto the scene. Many of these are open source, meaning that they tend to have faster development and to be more supportive of new standards. The new options are considered by many to be better than Microsoft’s Internet Explorer.
Tools and technologies
Web designers use a variety of different tools depending on what part of the production process they are involved in. These tools are updated over time by newer standards and software, but the principles behind them remain the same. Web graphic designers use vector and raster graphics packages for creating web-formatted imagery or design prototypes. Technologies used for creating websites include standardised markup, which can be hand-coded or generated by WYSIWYG editing software. There is also proprietary plug-in-based software that bypasses the client’s browser version; such software is often WYSIWYG, but with the option of using the software’s scripting language. Search engine optimisation tools may be used to check search engine ranking and suggest improvements.
Skills and techniques
Marketing and communication design
Marketing and communication design on a website may identify what works for its target market. This can be an age group or particular strand of culture; thus the designer may understand the trends of its audience. Designers may also understand the type of website they are designing, meaning, for example, that business-to-business (B2B) website design considerations might differ greatly from those for a consumer-targeted website such as a retail or entertainment website. Careful consideration might be made to ensure that the aesthetics or overall design of a site do not clash with the clarity and accuracy of the content or the ease of web navigation, especially on a B2B website. Designers may also consider the reputation of the owner or business the site is representing to make sure they are portrayed favourably.
User experience design and interactive design
Whether users understand the content of a website often depends on whether they understand how the website works. This is part of the user experience design. User experience is related to layout, clear instructions, and labeling on a website. How well a user understands how they can interact on a site may also depend on the interactive design of the site. Users who perceive a website as useful are more likely to continue using it. Users who are skilled and well versed in website use may find a more unique, yet less intuitive or less user-friendly, website interface useful nonetheless. However, users with less experience are less likely to see the advantages or usefulness of a less intuitive website interface. This drives the trend towards a more universal user experience and ease of access, to accommodate as many users as possible regardless of skill. Much of user experience design and interactive design is considered in the user interface design.
Advanced interactive functions may require plug-ins, if not advanced coding language skills. Choosing whether or not to use interactivity that requires plug-ins is a critical decision in user experience design. If the plug-in doesn’t come pre-installed with most browsers, there’s a risk that the user will have neither the know-how nor the patience to install a plug-in just to access the content. If the function requires advanced coding language skills, it may be too costly in either time or money to code compared to the amount of enhancement the function will add to the user experience. There’s also a risk that advanced interactivity may be incompatible with older browsers or hardware configurations. Publishing a function that doesn’t work reliably is potentially worse for the user experience than making no attempt. Whether the interactivity is needed, or worth the risks, depends on the target audience.
Part of the user interface design is affected by the quality of the page layout. For example, a designer may consider whether the site’s page layout should remain consistent on different pages. Page pixel width may also be considered vital for aligning objects in the layout design. The most popular fixed-width websites generally have the same set width to match the current most popular browser window, at the current most popular screen resolution, on the current most popular monitor size. Most pages are also center-aligned for concerns of aesthetics on larger screens.
Fluid layouts increased in popularity around 2000 as an alternative to HTML-table-based layouts and grid-based design, both as a page layout design principle and as a coding technique, but were very slow to be adopted.[note 1] This was due to considerations of screen reading devices and of browser windows varying in size, which designers have no control over. Accordingly, a design may be broken down into units (sidebars, content blocks, embedded advertising areas, navigation areas) that are sent to the browser and fitted into the display window by the browser as best it can. Because the browser does recognize the details of the reader’s screen (window size, font size relative to window, etc.), it can make user-specific layout adjustments to fluid layouts, but not to fixed-width layouts. Such a display may change the relative position of major content units; sidebars, for example, may be displaced below body text rather than beside it. This is a more flexible display than a hard-coded grid-based layout that doesn’t fit the device window. In particular, the relative position of content blocks may change while the content within a block is left unaffected. This also minimizes the user’s need to scroll the page horizontally.
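The reflow decision described above can be reduced to a toy rule: a sidebar sits beside the content only when the window is wide enough for both, and otherwise drops below it. The widths below are arbitrary assumptions, not values any browser actually uses.

```python
def place_sidebar(window_width, content_min=480, sidebar_width=200, gap=16):
    """Decide whether a sidebar fits beside the main content or must
    drop below it, mimicking how a browser reflows a fluid layout so
    the user never has to scroll horizontally. Widths are in pixels
    and are illustrative defaults only."""
    if window_width >= content_min + gap + sidebar_width:
        return "beside"
    return "below"
```

On a wide desktop window the sidebar stays beside the text; on a narrow window the same markup reflows so the sidebar is displaced below it.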
Responsive Web Design is a newer approach, based on CSS3 and a deeper level of per-device specification within the page’s stylesheet through an enhanced use of the CSS @media rule.
Web designers may choose to limit the variety of website typefaces to only a few of a similar style, instead of using a wide range of typefaces or type styles. Most browsers recognize a specific set of safe fonts, which designers mainly use in order to avoid complications.
Font downloading was later included in the CSS3 fonts module and has since been implemented in Safari 3.1, Opera 10 and Mozilla Firefox 3.5. This has subsequently increased interest in web typography, as well as the usage of font downloading.
Most layouts on a site incorporate negative space to break the text up into paragraphs and also avoid center-aligned text. 
The page layout and user interface may also be affected by the use of motion graphics. The choice of whether or not to use motion graphics may depend on the target market for the website. Motion graphics may be expected, or at least better received, on an entertainment-oriented website. However, a website's target audience with a more serious or formal interest (such as business, community, or government) might find animations unnecessary and distracting if they serve only entertainment or decoration purposes. This doesn’t mean that more serious content couldn’t be enhanced with animated or video presentations that are relevant to the content. In either case, motion graphic design may make the difference between effective visuals and distracting visuals.
Quality of code
Website designers may consider it good practice to conform to standards. This is usually done via a description specifying what the element is doing. Failure to conform to standards may not make a website unusable or error-prone, but standards can relate to the correct layout of pages for readability as well as to making sure coded elements are closed appropriately. This includes avoiding errors in code, using a more organized layout for code, and making sure IDs and classes are identified properly. Poorly coded pages are sometimes colloquially called tag soup. Validating via the W3C can only be done when a correct DOCTYPE declaration is made; the DOCTYPE is used to highlight errors in code. The validator identifies the errors and the areas that do not conform to web design standards, and this information can then be corrected by the user.
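One small facet of validation, checking that coded elements are closed appropriately, can be sketched with Python's standard html.parser. This is an illustration of the idea only: a real validator such as the W3C's checks far more than tag balance, and the list of void elements below is abbreviated.

```python
from html.parser import HTMLParser

VOID = {"br", "img", "hr", "meta", "link", "input"}  # elements with no closing tag

class TagChecker(HTMLParser):
    """Track open tags and flag mismatches -- one symptom of 'tag soup'."""

    def __init__(self):
        super().__init__()
        self.stack = []   # currently open (unclosed) tags
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

def check(markup):
    checker = TagChecker()
    checker.feed(markup)
    # Anything still on the stack was never closed.
    checker.errors.extend(f"unclosed <{t}>" for t in checker.stack)
    return checker.errors
```

Well-formed markup such as `<div><p>ok</p></div>` yields no errors, while `<div><p>bad</div>` is flagged for both the mismatched close and the unclosed paragraph.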
Further jobs that may, under particular circumstances, become involved in the creation of a website include:
- Graphic designers to create visuals for the site such as logos, layouts and buttons
- Internet marketing specialists to help maintain web presence through strategic solutions on targeting viewers to the site, by using marketing and promotional techniques on the internet
- SEO writers to research and recommend the correct words to be incorporated into a particular website and make the website easier to find on numerous search engines
- Internet copywriters to create the written content of the page to appeal to the targeted viewers of the site
- User experience (UX) designers to incorporate aspects of user-focused design considerations, including information architecture, user-centered design, user testing, interaction design, and occasionally visual design