Design, as we explained earlier, plays a major role in making a great website. A design that is merely visually appealing, however, doesn’t guarantee a good rank on search engines. The internet is peppered with beautiful websites that rank poorly. Websites need to adhere to certain guidelines to be well optimized for search engines.
SEO is about making your pages rank higher in search results. But there are certain pages on your website that you don’t want users to arrive at from search results. The robots.txt file tells search engine crawlers to stay away from such pages, keeping them out of search results.
Search engines use bots, or robots, to crawl websites and learn about them, so that they’ll know which websites should show up for a particular keyword. Whenever such bots arrive at a website, the first thing they look for is the robots.txt file, because it contains instructions from the website’s owner. Now, there are good bots and bad bots. The bad ones, like malware bots on the lookout for security vulnerabilities, pay no attention to the robots.txt file.
What is the role of robots.txt?
It contains two important pieces of information: which bots are allowed to crawl the website, and which pages on the site should not be crawled.
How to create a robots.txt file?
It can be created using any text editor. The file name is case-sensitive, so it should be lower-case only. The robots.txt file should be placed in the root folder of your website, along with the index or welcome page, so that its path is always www.yourdomainname.com/robots.txt.
It usually has two directives. User-agent specifies the bot to which the following instructions apply. Disallow specifies the pages that are restricted.
A simple example of a robots.txt file is as below.
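User-agent: *
Disallow: /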
So, in the example above, the “*” beside User-agent says that the following commands apply to every kind of bot that lands on this site.
The “/” beside Disallow indicates that everything under the root folder is restricted to bots. That means no page on the site should be crawled, by any bot.
Here are a few more examples. To permit select bots and keep the rest out:
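(Googlebot below is purely an illustration; substitute the user-agent of whichever bot you want to admit.)

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /

A Disallow line with nothing after it means nothing is off-limits to that bot.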
To restrict select directories on a website from being crawled, the commands would be:
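(The directory names here are placeholders; substitute your own.)

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/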
To block files of a specific type:
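(PDF is used as an example here. Note that the “*” and “$” wildcards are extensions honored by major crawlers such as Googlebot, not part of the original robots.txt standard.)

User-agent: *
Disallow: /*.pdf$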
To block a particular directory, and everything in it:
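(/archive/ below stands in for any directory on your site; the trailing slash restricts the directory and everything beneath it.)

User-agent: *
Disallow: /archive/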
Alternate method – META TAG:
You can also include a robots <meta> tag in the <head> section of every page on your site. The syntax is:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
<META NAME="ROBOTS" CONTENT="INDEX, NOFOLLOW">
The FOLLOW / NOFOLLOW value applies to the links on that page. If it is NOFOLLOW, bots should not follow any links on that page. If no robots meta tag is included, INDEX and FOLLOW are implied, so there is no need to mention them explicitly.
With Zoho Sites, you can access the crawler specification on the SEO settings page.
The topics discussed here are very simple changes to your webpages which, when combined, have a notable impact on how your website is treated by search engines.
Here is what matters most, and why.
It all starts with keywords, even though they are not exposed to users. A keyword is the term people search for in order to arrive at a website. You first need to identify the keywords that are both relevant to what you offer and likely to bring you traffic. Each page has its own focus terms, so one page’s keywords differ from another’s. Identifying keywords is the most important task of them all, because the title, description and URL of a page depend on, and contain, its keywords.
The more specific your keywords are, the better. For example, if you own a bakery in Seattle and specialize in muffins, then Muffins in Seattle is a better option than Cakes.
Users are more likely to click on a search result that contains the term they just searched for. That is why the title and description draw attention when they contain these keywords.
The title of your page is enclosed in <title> tags and appears as the clickable headline of a search result. There are some guidelines an optimum title should always follow.
1. Uniqueness – No two pages of your website should have the same title.
2. Readability – It should be a meaningful phrase, long enough to be descriptive but within about 65 characters (longer titles get truncated on search results), and not spammy.
3. Keyword – The title should contain the keyword that the page focuses on. The earlier it appears in the title, the better.
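For instance, borrowing the hypothetical Seattle bakery from earlier, a title that follows these guidelines might look like this (the business name is made up):

<!-- "Sweet Crumb Bakery" is a hypothetical business name -->
<title>Muffins in Seattle | Sweet Crumb Bakery</title>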
The description appears below the title on search results, and should support the claim you’ve made in your title.
1. Preferably no longer than 160 characters, since search engines truncate longer descriptions.
2. Should also contain the keyword that page focuses on.
3. Should be unique, informative and interesting.
4. Should sum up the content of the entire page.
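Continuing the same hypothetical bakery, a description tag might look like this:

<!-- The business name and details are made up for illustration -->
<meta name="description" content="Sweet Crumb Bakery bakes fresh muffins in Seattle every morning. Browse our menu or visit the store for daily specials.">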
Easily understandable URLs have two main advantages: they make your website easy for users to navigate, and they simplify managing your pages. The characteristics of a good URL are:
1. A few easy words, and no numbers.
2. Not too long.
3. Specific keywords like home.html, contact.html, etc. instead of generic ones like page1.html, page2.html, etc.
It is best to have only one H1 heading tag per page, and it should contain the same keyword used in the title and description tags.
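For example, the hypothetical bakery page above might carry:

<h1>Muffins in Seattle</h1>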
SEO with Zoho Sites:
All these live in the <head> of your page, in the <title> tag and meta tags such as the description. If you are managing your own HTML files on Zoho Sites, you are probably familiar with these tags. If you are using the drag-and-drop builder instead, things are even easier: Zoho Sites lets you enter meta details for every page on your site.
On the Settings page, you will find the SEO options.
In addition to this meta information, there is another essential optimization technique, but it happens automatically on Zoho Sites: the sitemap. For every website published, Zoho Sites automatically generates the sitemap.xml file. A sitemap helps search engines identify the entire structure of your website, so that no page goes unnoticed. It also helps in identifying the navigation path to every page.
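A sitemap.xml file follows the standard sitemap protocol; a minimal sketch with placeholder URLs looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yourdomainname.com/</loc>
  </url>
  <url>
    <loc>http://www.yourdomainname.com/contact.html</loc>
  </url>
</urlset>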
All the above parameters focus only on the text on a webpage. There are numerous other techniques to drive more traffic to your website, which we will gradually get into. Try optimizing your web pages by filling in these parameters, and see how it affects your site’s ranking on search engines.
Hitting the top spot on a search engine results page will, to say the least, bring good things for your site. High-quality, relevant, constantly updated content improves your website’s worth in the eyes of the search engines. And most times, the people who have the best, most relevant, freshest content for your site are… gasp! your customers. These days, crowd-sourcing some of your content to your community is a no-brainer. It is a win-win situation: you serve your customers by providing them a forum to get answers and connect with other users, and at the same time you serve the marketing purpose of getting the content and activity that you need for SEO.
It is little surprise, then, that a whole bunch of forum software vendors have mushroomed to serve this demand.
However, traditional forum software requires additional modifications or plugins to create meaningful, search-engine-optimized sites. And additional software always means additional maintenance. It takes a whole new level of expertise to customize them to stay in visual and login sync with the rest of your website. At the other end of the spectrum are some of the latest web 2.0 communities, which are either pretty expensive or require you to use their user base, domain and branding, thereby boosting their search engine ranking – NOT yours.
Alternatives? Of course! Zoho Discussions is the perfect you-can-have-your-cake-and-eat-it-too type of solution. When you host your customer support communities in Zoho Discussions, you automatically get:
- Search engine-friendly, descriptive URLs – no additional configuration or plug-ins to install.
- The topic title is set up meaningfully as the page title, improving relevancy.
- Even on the Basic plan, you can configure multiple domains for your forums area. Zoho Discussions will automatically channel all these domains to a single chosen “primary domain” so that your users retain context while you also get good search engine mileage.
- Embeddable forums – you can also embed an entire portal inside your website, either to simplify visual integration or to provide contextual widgets with recent discussions.
Of course, search engine juice is not the only area of focus while setting up an online forum. Search engines solve only one part of the problem – helping users reach your website. Once users are brought in, they need to see a brand-consistent and intuitive discussions area. We already wrote about the powerful server-side-script-enabled visual customization and single sign-on options that Zoho Discussions provides.
But that is not all. Improving end-user experience is built right into the DNA of Zoho Discussions. Features like:
- Rich Topic Types and workflows, instead of just generic posts.
- An intuitive editor that supports multimedia embedding – so that screenshots can be embedded right in context.
- User labels – award passionate customers with “expert” tags. Give them recognition for the time and contribution they bring in.
- Powerful moderation features that stop spam right at the door.
Help your users participate better and generate meaningful content that helps you back! Go ahead, sign up and create your online customer support community. You can try it out risk-free for a month, and should you have any questions or require help setting up, feel free to contact us or, of course, post in our forums (powered by Zoho Discussions!).
Zoho Discussions also has tools and integration options that help you measure and monitor the effectiveness of user traction. That deserves a separate post of its own!