5 SEO Tips for Attorneys
There are millions of articles and a constant stream of new advice on SEO. That makes sense, because search engine optimization is an important and complex topic for many professions, including attorneys. Law firms in particular can benefit greatly from increased exposure to potential clients, given the high value of each one.
While it is certainly preferable to work with a professional agency that provides SEO for lawyers, you can still try your hand at some of the basics yourself. With just a few steps you can often get a lot more out of your own website.
The starting point of every successful SEO effort is defining the right keywords. Keywords are the terms that Internet users most often type when searching for information, images, products or other things from your subject area.
Finding the right keywords is not complicated; it already helps to put yourself in the shoes of your prospective clients. How exactly to proceed is described in this article on the subject of keyword research.
The following does not apply to SEO: The more keywords I use, the better. It’s more important to use the right keywords naturally – as if you weren’t “talking” to Google, but to a real person.
Note for advanced users: This article deals exclusively with onpage optimization, i.e. measures with which you can directly ensure on your website that your page is better ranked by Google.
Step 1: Page Title (meta title)
You can see the page title of a website at the top of the browser window or tab. It is also the first thing visitors see on search engines: the title is the blue headline in the results that leads directly to your website when clicked.
The page title has two parts:
- The general page title: the main title of your website, shown for all subpages.
- The individual page title of a subpage, which Google displays before the general title and signals what that subpage is about.
For the individual title, a descriptive name such as “About us” or “Picture gallery” works well. If you do not set an individual page title yourself, the name of the respective subpage will be displayed.
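In HTML, both parts typically end up together in the `<title>` tag of each subpage. A minimal sketch (the firm name and subpage name are invented placeholders; most CMSs such as WordPress generate this tag from your title fields, so you rarely edit it by hand):

```html
<head>
  <!-- Individual subpage title first, general site title after it -->
  <title>About Us | Smith &amp; Partner Law Firm</title>
</head>
```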
Step 2: Page Description
The page description should directly address potential visitors. This is because the description is displayed to users at Google & Co. as text under the page title and is intended to convince users to click on your website. It is therefore not “technically” important for search engines to find your site, but informs users about your content.
Good to know: If you don’t define a description yourself, most search engines will pull arbitrary text passages from the page – and those probably aren’t very meaningful. Remember to stick to white-hat SEO: don’t just stuff keywords into your descriptions (or anywhere else, really).
The page description should contain a maximum of 155 to 160 characters. It is also good if you include the most important concepts of the website in it. Google will often display them in bold type in search results – and that can help get more visitors to your site.
Here’s how you do it: Insert a short page description for each subpage of your website. It should get to the heart of what it’s all about and, if possible, contain the most important keywords. More about what to look out for when writing your page description can be found in this article.
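The description lives in a meta tag in the `<head>` of each subpage. A minimal sketch, with an invented practice area and city as placeholders:

```html
<head>
  <meta name="description"
        content="Family law attorneys in Springfield – free initial consultation. Divorce, custody and support, explained in plain language.">
</head>
```

Note that the text stays under the 155–160 character limit mentioned above and leads with the terms a searcher is likely to type.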
Step 3: Headings (H1, H2, H3)
Now we come to the supreme discipline of on-page SEO! Your headlines are not just text in a larger font: they divide your website into sections. The headings thus help search engines recognize the structure of your content – just as we, as newspaper readers, want to see quickly where to find which content.
The abbreviations H1, H2 and H3 are of course not visible to visitors on your website. However, Google recognizes them in the HTML code of your page. This gives Google information about the structure of your website: Where is which content?
The H1 is your main heading and therefore refers to the entire text on the page. The H2 and H3 headings refer to the subordinate paragraphs – but of course they are no less important.
This is how you proceed: Give each of your subpages one (and really only one) H1 heading. It should describe clearly and simply what your website is about. Integrate, if possible, the most important keywords – but also make sure that the wording does not become unnatural.
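In HTML, that structure looks like the sketch below: exactly one H1, with H2 and H3 headings for the subordinate sections (the topics here are invented placeholders):

```html
<!-- One H1 per page; H2/H3 structure the sections below it -->
<h1>Family Law Attorney in Springfield</h1>
  <h2>Divorce and Separation</h2>
    <h3>How a divorce proceeding works</h3>
    <h3>Costs and duration</h3>
  <h2>Custody and Child Support</h2>
```

The indentation is only for readability; what matters to search engines is the tag hierarchy itself.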
Step 4: Alt Image Tags
Alternative image texts (alt texts) express in words what can be seen in your pictures. Normally they are not visible to your visitors, but they describe the image content for search engines (and for screen readers).
Use your central search terms in the “alternative image text” fields on your website as well. For a product photo of a black men’s suit, for example, you would write “black men’s suit”. You can find many more tips in our article about image optimization for design and SEO.
This is how you proceed: For each picture on your website, click the caption/alternative text field in your CMS (I recommend WordPress) and describe the image in a few words as accurately and comprehensibly as possible. For shop products, the product name usually works as the alt text.
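In the page source, the alt text ends up in the `alt` attribute of the image tag. A sketch with an invented filename and description:

```html
<img src="mens-suit-black.jpg"
     alt="Black men's suit with white shirt and grey tie">
```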
Step 5: Content
Admittedly: Simply optimizing all content for search engines is not done in five minutes. Nevertheless: The content of your website is the most important factor in onpage optimization.
If you take the time to research the right content for your website and incorporate keywords, you’ll soon notice that you’re attracting significantly more visitors via Google & Co.
What should you watch out for? The most important point has already been mentioned above: always remember that you are writing the texts for people, not for search engines. As soon as you fill your website with exciting content and important information, you will attract visitors – and with them the attention of search engines.
This is how you proceed: Take some time to create good, meaningful content for your website. It may be tempting to focus more on the look and feel, but the text is even more important. It should answer the most important questions about your product or service and describe clearly what your website is about.
Also make sure you present different content on different subpages and adjust your SEO settings accordingly. But the most important thing is: Use a natural language and don’t fill the texts with keywords by hook or by crook – it should still be fun for your visitors to read them.
Be brave: you don’t have to reinvent the wheel! Sometimes search engines seem like a black box where you don’t know what’s going on. So it’s good to know that you can fall back on numerous tricks that others have already tried before you and found good.
Do you have any questions about our tips or would you like to tell us about your own experiences?
We are looking forward to your comments!
What is White-Hat SEO?
White-Hat-SEO refers to search engine optimization procedures that respect the guidelines of search engine operators such as Google, Yahoo! or Bing and therefore do not violate them. The term comes from English and is composed of “white hat” and “SEO”, the common abbreviation for search engine optimization.
This term was probably chosen in the style of a classic scene in old Western movies: Two cowboys face each other and are about to duel. The evil cowboy wears a black hat and the good cowboy a white hat. Therefore the good natured activities in the context of search engine optimization are called White-Hat-SEO.
The content of White-Hat-SEO
Basically, search engine optimization aims to make your website rank better in organic search results. While Black-Hat-SEO takes these efforts to an extreme and is characterized by intensive spam practices such as doorway pages, link buying, cloaking or sneaky redirects, White-Hat-SEO distances itself from these practices and fully complies with the guidelines of the search engines. It is imperative today to focus on a clean methodology and strategy.
In general, White-Hat-SEO is the method of choice when website operators want their online presence to remain findable via search engines in the long run: the guidelines can be adhered to comfortably through natural link building via high-quality content. However, the room for manoeuvre beyond on-page optimization is rather limited here, which is why many online marketers operate in the area of so-called Grey-Hat-SEO. This means that although they do not violate guidelines through outright spam practices, they do try to improve their rankings through targeted links placed on external websites, for example. Search engines do not welcome this either, but they tolerate it more readily than Black-Hat-SEO activities.
Ways to get natural links
For a link-building method to look natural, make sure not to overdo it and maintain a high quality standard for everything you put online with your link attached.
1. Forums
Forums are perfect for this, and you should definitely sign up in a few thematically related forums. If your blog is about dogs, you will surely find forums for dog owners, animal lovers or similar topics. Here you will not only find an ideal audience, but you can also collect ideas for future articles on your blog. Refresh your knowledge, take part in discussions and take note of frequently asked questions from other members.
To bring forum members to your blog, you have to lure them with information. Whenever someone asks a question that is answered on your blog, you can point them to it. If you leave a link, you will not only generate new visitors but also create a thematically relevant link, which helps you achieve better rankings in Google & Co. But make sure you follow the forum rules and do not spam: offer everyone added value rather than acting purely in your own interest.
2. Question and answer portals
The Internet is perfectly suited for obtaining information free of charge. In question-and-answer portals, people ask the community questions and hope for a well-founded, detailed answer. If you find a question here that your blog answers, that is a jackpot: the questioner will be very happy, because hardly anyone will take the trouble to answer the question in as much detail as your blog does, and you will again gain new visitors. Moreover, a link on such a portal is very valuable and makes your blog appear important in the eyes of search engines.
3. Blog comments
You probably won’t run the only blog in your subject area. It makes sense to check out the competition and learn from it. This will give you new ideas for content, design and promotion. Every blog owner is happy when his texts are read and readers leave a comment. Most blogs offer you the opportunity to place a link when commenting. If you manage to arouse the interest of other readers, you have a chance that they will follow your link to your blog.
SEO Safe Site Relaunch
What is important for a website relaunch? Above all, set up redirects when URLs change. True, but even if all redirects are correct, the new page can still be deindexed – which means nothing less than: no URLs in the index = no rankings and no traffic. This aspect is therefore also very important and should definitely be taken into account. To avoid this worst-case scenario, I have asked an SEO agency to explain three scenarios in which deindexing can occur and how to avoid the problem.
Blocking the new page with robots.txt
An introduction to the structure and the possible contents of a robots.txt can be found here as a detailed description under robots.txt – What is it and how do I use it?
A test environment (or sandbox) can be protected against indexing by robots.txt (alternatively, or in addition to, a password). Indexing is not yet desired for a test environment because, firstly, its contents are often at least partially identical to the original page (a duplicate-content SEO issue) and, secondly, no unfinished page should get into the index.
A possible variant to protect the test environment is the following code in robots.txt:

```
User-agent: *
Disallow: /
```

This denies all search engine crawlers access to the entire domain.
Avoidance of the SEO problem
The code that excludes the test environment from indexing must be adjusted when the live page launches, so that no URLs on the new web page are excluded from crawling. In our example, the code would change to:

```
User-agent: *
Allow: /
```
Another way to protect a test environment from indexing is to set all URLs to noindex. If a new page is launched and the non-indexing setting is not removed, the page will not be indexed and therefore cannot generate any rankings or traffic.
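The noindex setting is typically a robots meta tag in the `<head>` of every page of the test environment; `nofollow` is often added alongside it, though that part is optional:

```html
<head>
  <!-- Tells crawlers not to index this page -->
  <meta name="robots" content="noindex, nofollow">
</head>
```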
The noindex markup must be removed with the relaunch to be SEO safe. To check whether this is the case everywhere, you can crawl the new page and check for exactly this point. This can be done for example with the extraction function in Screaming Frog. Alternatively, there is also an extra column for Meta Robots, where this information can also be read.
Deindexing caused by redirect loops
At a relaunch you should also take the opportunity to change the site to https. Our article https-Switch in 7 steps explains what you have to consider.
In addition, you must ensure that all http URLs are redirected to the corresponding https URLs by a redirect rule – and that there is no second rule sending the https URLs back to the old http URLs.
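On an Apache server, for example, the correct one-way redirect could look like this in .htaccess (a sketch; adjust to your own server setup):

```apache
RewriteEngine On
# Redirect every http request permanently (301) to its https counterpart
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The important point is that this is the only rule touching the protocol: no second rule may send https traffic back to http.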
If both rules exist, the search engines follow the redirects in a circle, never reach a target, and the page gets deindexed. This can happen not only when switching to https but also for individual URLs, i.e. when a source URL is redirected to a target URL and then back to the source URL. This issue is a big one, so I’ve asked Who’s Talkin to explain what to do in such a scenario. They even do SEO for plastic surgeons, so they know what they are talking about.
The answer: after the relaunch, the redirects must be checked. Pay special attention to ensuring that there are no redirect chains and certainly no redirect loops, since these can lead to a fast deindexing of the page. Redirects may only lead from the old URL to the new URL.
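If you have exported your redirect rules as a simple source→target mapping (for example from a crawl export), a small script can flag loops before they cause damage. This is a minimal sketch under that assumption; the function name and data format are illustrative, not part of any particular tool:

```python
def find_redirect_loop(redirects, start, max_hops=20):
    """Follow `start` through a source->target redirect mapping.

    Returns the list of URLs forming the loop if one is found,
    or None if the chain ends at a final URL (or max_hops is hit).
    """
    visited = []
    url = start
    while url in redirects and len(visited) <= max_hops:
        if url in visited:
            # The chain returned to a URL we already saw: a redirect loop.
            return visited[visited.index(url):] + [url]
        visited.append(url)
        url = redirects[url]
    return None

# Example: a correct http->https rule plus a faulty reverse rule
rules = {
    "http://example.com/": "https://example.com/",
    "https://example.com/": "http://example.com/",  # the bad reverse rule
}
print(find_redirect_loop(rules, "http://example.com/"))
```

Run against your full redirect map, any non-None result is a loop that must be fixed before the relaunch goes live.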
The topic of indexing must always be taken into account during a relaunch. The best website is worthless if it doesn’t get into the index of search engines and you don’t get any organic traffic. In order to avoid problems with indexing, the following points must be examined:
- index / noindex meta tags
- robots.txt disallow rules
- possible redirect chains and loops