Ginbrookes Webs





Web for the uninitiated

posted on

2018-02-17 03:45:12pm


Domain:
A domain name, simply put, is both a name and an address tag. You choose the name, assuming somebody else has not already got it, plus a "top-level domain" (TLD), which is the dot org or dot com part.

Effectively, when you buy a domain you're buying a lease: it's yours until you stop renewing it, usually on an annual or multi-year cycle. A domain name is unique; no-one else can have exactly your domain name. For instance, if you register somebusiness.com, nobody else can have that. However, somebody could get somebusiness.co.uk or a similar combination.
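To see the two parts side by side, take the example above apart:

somebusiness.com  =  somebusiness (the name you choose) + .com (the top-level domain, or TLD)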

Web Hosting
That's where your web content lives, ready to be served up to the world. But how does the world know about or link to your content? Simple: that's where the domain name comes in; you point your domain at the web server where your content is.

With street addresses you might have Mr & Mrs Smith at, say, 17 Some Street; when the Smiths move house they can't take that address with them. On the web you can: you can move your web pages to another host and everybody will still get directed there, because you simply edit the IP address on your domain to point to the new hosting server.
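What "editing the IP address" usually means in practice is updating the DNS A record for the domain, either in a zone file or through your registrar's control panel. A rough sketch, with made-up placeholder addresses, would be changing:

somebusiness.com.    3600    IN    A    203.0.113.10     ; old hosting server

to:

somebusiness.com.    3600    IN    A    198.51.100.25    ; new hosting server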

You can buy a domain and hosting from the same company, often at a discount, but sometimes (especially in the past) you could find that the deal tied your domain to the hosting. That means if you go off the hosting at some point, you are stuck when you try to point the domain somewhere else.

Static pages:
Static web pages, if looked at with a text editor, are based on a basic template similar to this one:
<html>
<head>
</head>
<body>
</body>
</html>
These are pages that don't change at all, or at most are likely to need only minor edits in the future. A typical example might be "about_us.html".
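To make that concrete, a complete if bare-bones about_us.html could look something like the sketch below; the business details are just placeholders:

<html>
<head>
  <title>About Us - Some Business</title>
</head>
<body>
  <h1>About Us</h1>
  <p>Some Business is a small family-run firm supplying widgets across the UK.
  Get in touch to find out more.</p>
</body>
</html>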

Pages like these are generally put together by a web editor or developer using information supplied by a client, then uploaded to the web hosting, often via cPanel. Static pages are described as static because their only function is to display images and text when viewed in a web browser.

For a user like a small business, once static pages are set up you will want to be able to add more content, but you might not want to keep paying a web developer. You have enough on your plate creating invoices and chasing bad debts, and you don't want the extra task of logging in to cPanel and manoeuvring through it to add more content.

This is where the beauty of blogs comes in. You don't have to access the web hosting area directly if a website has been coded to have a simple admin login and a blog submission page. WordPress has such a system, and for certain web set-ups it's fine.

However, it can still be confusing for people who don't really want to mess about with dashboards and their settings. This is why we coded a simple system. The way it works is that you go to your domain using just your browser and click on a login link. From there you get a blog submission link taking you to a page that has a couple of text boxes and an image browse button. Because the system is stripped down, there is no spellcheck editor.

That's no problem: just work on your post using a word processor, do a spell check and correct mistakes, then save it as article.txt. That will get rid of hidden text formatting. Then open it with a text editor, copy and paste it into the text box.
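The exact markup of our submission page isn't shown here, but in spirit it is little more than a form with a couple of text fields and a file picker, something like this sketch (field names are illustrative only):

<form action="/submit_post" method="post" enctype="multipart/form-data">
  <input type="text" name="title" placeholder="Post title">
  <textarea name="body" rows="12" cols="60"></textarea>
  <input type="file" name="image">
  <input type="submit" value="Submit post">
</form>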

The system I'm talking about is exactly what I'm using on this website and exactly how I submitted this blog post. Web surfers can then engage with you via comments, to which you can reply. Again, it's all done through a simple system. Comments are accepted but are not visible until you read them and make them visible, again through a simple user interface.

How does this work, and how does it differ from static web pages? Well, the basics are the same in that a blog page has a similar format. The difference is that the content you type into the text boxes does not go straight into the page. Instead it's actually stored in a database, and the website is coded to retrieve that content and dynamically put it into the page.
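As a rough illustration only, and not the actual code running on this site, a dynamic page might pull a post out of a database along these lines. It is written in PHP purely because this site appears to run on PHP, and the table and column names are made up:

<?php
// Illustrative sketch: look up one blog post in a database
// and drop its title and body into the page.
$db = new PDO('sqlite:blog.db');
$stmt = $db->prepare('SELECT title, body FROM posts WHERE id = ?');
$stmt->execute([$_GET['id']]);
$post = $stmt->fetch(PDO::FETCH_ASSOC);
if (!$post) {
    http_response_code(404);
    exit('Post not found');
}
?>
<html>
<body>
  <h1><?= htmlspecialchars($post['title']) ?></h1>
  <p><?= nl2br(htmlspecialchars($post['body'])) ?></p>
</body>
</html>

The comment system mentioned above works on the same principle: comments sit in the database and the page only pulls out the ones that have been made visible.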

Sitemap.xml

In theory, as your website grows and the number of pages increases, those pages would over time be picked up and show up in searches done by surfers on Google. However, the idea of a sort of index listing the pages your website currently has was introduced in 2005.

Put simply, sitemap.xml is a text file containing a list of all your website's pages, making it easy for search engines to catalogue them so that they turn up in searches. Pages are listed in the form of the URL that needs to be in the address bar of a web browser to see the page. For instance, an entry in sitemap.xml for the contact us page of ginbrookes would be:

<url>
  <loc>https://www.ginbrookes.co.uk/contact_us</loc>
  <lastmod>2018-02-25</lastmod>
  <priority>0.80</priority>
</url>
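That entry sits inside the standard wrapper the sitemap protocol requires, so a minimal complete file looks roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> block per page, like the example above -->
</urlset>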

You can create a sitemap.xml manually, but it's easy to make some minor mistake, and with Google being picky it's easier to use a third party to create it for you for free: www.xml-sitemaps.com

The sitemap.xml is put at the hosting root of the website, and then webmasters tell Google about it via Webmaster Tools.

robots.txt

Now, over time, as indexing got more savvy, "internet bots" arrived that basically crawled the web, producing data about websites and their pages that was used like a digital Yellow Pages for people searching online. Trouble was, who or what was going to moderate how far bots probed?

This is where the idea of robots.txt came along. It's also a text file, again placed at the root of your website hosting. In the file you state what you want to give bots access to crawl and what you don't. I say state, or rather suggest, your preferences, since it's unlikely that all bots are going to respect it. It does have an influencing effect for sure, as I found out when I was unable to get xml-sitemaps to produce a sitemap.xml file for me. Basically I had inadvertently blocked their scan due to a poorly written robots.txt file. I fixed this by adding an Allow rule for sitemap.xml for Googlebot, a few other legitimate bots and, finally, the "every other bot" section:

        
# Every other bot not covered by the named sections above
User-agent: *
# Let crawlers reach the sitemap itself
Allow: /sitemap.xml
# Ask bots to stay out of these paths
Disallow: /index.php
Disallow: /api
Disallow: /lib
