Turkeyhomes.com sells high-quality real estate in Turkey to a global audience. The company’s target audience includes clients from the United States, the United Kingdom, Scandinavian and other European countries, as well as Azerbaijan, Iran, Iraq, Kuwait, Saudi Arabia, Lebanon, and the United Arab Emirates. The properties for sale are located in all the major Turkish cities.
The team was limited by budget, and CMS restrictions also limited the content changes that could be made to the website. So we decided to start with an audit to understand what could be improved within these constraints.
We considered the project details and developed a two-stage working plan: a consulting stage, which included an audit, followed by technical optimization.
We selected more than 3,000 queries on the topic according to the client’s priorities, focusing on target locations and the main types of properties for sale rather than on long-tail queries.
As a result, basic key phrases like “Turkey + apartments,” “Izmir + house,” and “Istanbul + villa” were created. The team also researched the key phrases used by Turkeyhomes.com’s competitors and employed them during later stages of the project.
Every key phrase was checked in Google Keyword Planner, and we kept the relevant queries.
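To illustrate the approach, here is a minimal sketch of how such location and property-type combinations can be generated before checking them in Keyword Planner. The lists and modifiers below are hypothetical examples, not the client’s actual data.

```python
from itertools import product

# Hypothetical example lists; the real project used the client's
# priority locations and property types.
locations = ["Turkey", "Istanbul", "Izmir", "Bodrum", "Antalya"]
property_types = ["apartments", "house", "villa", "penthouse"]
modifiers = ["for sale", "buy"]

# Build basic key phrases such as "Izmir house" and "Istanbul villa for sale".
candidates = set()
for location, prop in product(locations, property_types):
    candidates.add(f"{location} {prop}")
    for modifier in modifiers:
        candidates.add(f"{location} {prop} {modifier}")

for phrase in sorted(candidates):
    print(phrase)
```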
Some queries came from Google Analytics, which we linked with Google Search Console so that Google Analytics collected click and impression data correctly.
The technical specialist also used services such as Ahrefs and SEMrush to find keywords that the competitors targeted.
We analyzed the collected keywords, selected the most appropriate ones, and distributed them among the landing pages. This helped create a site structure that reflected the needs of customers.
The main principle we used was the following: a single group of queries should address a single customer demand.
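Below is a minimal sketch of that grouping principle, assuming each query can be reduced to a (location, property type) pair. The dictionaries, queries, and URL scheme are illustrative, not the project’s actual implementation.

```python
from collections import defaultdict

LOCATIONS = {"istanbul", "izmir", "bodrum", "antalya"}
PROPERTY_TYPES = {"apartment", "house", "villa", "penthouse"}

def classify(query: str):
    """Reduce a query to a (location, property type) pair, or None."""
    # Crude singularization so "apartments" and "apartment" fall together.
    words = [w.rstrip("s") for w in query.lower().split()]
    location = next((w for w in words if w in LOCATIONS), None)
    prop = next((w for w in words if w in PROPERTY_TYPES), None)
    return (location, prop) if location and prop else None

groups = defaultdict(list)
queries = ["Istanbul villa for sale", "buy villa Istanbul",
           "Bodrum penthouse", "penthouse in Bodrum"]
for query in queries:
    key = classify(query)
    if key:
        groups[key].append(query)

# One group of queries -> one customer demand -> one landing page.
for (location, prop), grouped in groups.items():
    print(f"/{location}/{prop}/ ->", grouped)
```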
The general structure of the landing pages looked like:
A part of this structure looked like:
This is an example of a landing page for the ‘penthouse Bodrum’ keyword:
We analyzed the resulting landing pages and selected a priority list. For these landing pages, the team generated unique Titles, Meta-Descriptions, and H1 headers:
It was important to include the keywords in the Titles while keeping the Titles themselves short, since overly long Titles reduce a keyword’s weight.
Meta-Descriptions should also contain the keywords. The main point to keep in mind was that every Meta-Description had to look appealing to users, since it is displayed in the snippet and should encourage people to click on the link.
Titles and Meta-Descriptions were generated from the popular accompanying keywords for the niche and unique variables for every page. As a result, we created a generation algorithm for landing-page Titles and Meta-Descriptions that the developers could implement. The algorithm took into account locations in Turkey, types of property, and tags for property objects.
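Here is a minimal sketch of what such a generation algorithm can look like. The templates, brand name, and variables are simplified illustrations of the approach, not the exact templates handed to the developers.

```python
# Simplified, illustrative templates; the real pages also used unique
# per-page variables.
TITLE_TEMPLATE = "{prop} for Sale in {loc} | Turkey Homes"
DESC_TEMPLATE = ("Browse {prop_lower} for sale in {loc}, Turkey: {tags}. "
                 "View prices and photos and contact our local agents.")

def generate_meta(loc: str, prop: str, tags: list[str]) -> tuple[str, str]:
    """Build a short Title and an appealing Meta-Description for one page."""
    title = TITLE_TEMPLATE.format(prop=prop.title(), loc=loc)
    description = DESC_TEMPLATE.format(prop_lower=prop.lower(), loc=loc,
                                       tags=", ".join(tags))
    return title, description

title, description = generate_meta("Bodrum", "penthouse",
                                   ["sea view", "private pool"])
print(title)        # Penthouse for Sale in Bodrum | Turkey Homes
print(description)
```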
This is a part of the template with URLs, page titles, and Meta-Descriptions:
We developed an algorithm for composing technical tasks for our client’s copywriter, who edited the existing content and added contextual interlinks. In general, the algorithm looks like this:
1. Find the pages at the top of the search results.
2. Analyze what type of content ranks highly and whether articles, product descriptions, long reads, or something else is popular with the target audience.
3. Study the frequency of the keywords in the texts and look for patterns. Either way, the content should be human-friendly and bring value to the readers.
4. Research Latent Semantic Indexing (LSI) keywords, the words usually used to describe a specific topic. Google expects LSI terms to be present in text that is relevant to a particular topic, and special services can identify these words. A short sketch of this kind of frequency analysis follows the list.
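Here is a minimal sketch of the frequency analysis from steps 3 and 4, assuming the texts of the top-ranking pages have already been collected. In the project itself, specialized services did this work; the example texts and stopword list below are illustrative.

```python
import re
from collections import Counter

# Assumption: the texts of the top-ranking pages were collected earlier.
top_page_texts = [
    "luxury penthouse in Bodrum with sea view and private pool",
    "buy a penthouse in Bodrum near the marina, title deed ready",
]
STOPWORDS = {"a", "an", "the", "in", "with", "and", "near", "to", "of"}

counts = Counter()
for text in top_page_texts:
    words = re.findall(r"[a-z]+", text.lower())
    counts.update(w for w in words if w not in STOPWORDS)

# Frequent accompanying terms are candidate LSI keywords for the brief.
for word, freq in counts.most_common(10):
    print(word, freq)
```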
Sitemap.xml contained URLs from the http version of the website even though the site ran on https. It also contained technical pages, such as filters, currency exchange pages, private account pages, and search pages, that held little unique content. These pages did not attract traffic, and Google treated them as duplicate content.
The specialists also redirected http pages to https. By doing so, we removed the duplicates and low-value pages from the index. After the http versions dropped out of indexation, the website saw a slight increase in positions.
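A minimal sketch of such a sitemap audit might look like this. The sitemap URL and the patterns for technical pages are illustrative assumptions; only the standard sitemap namespace is relied on.

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.turkeyhomes.com/sitemap.xml"  # illustrative URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
# Illustrative URL patterns for technical pages that should not be indexed.
TECHNICAL_PATTERNS = ("/search", "/currency", "filter=", "/account")

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if url.startswith("http://"):
        print("non-https URL in sitemap:", url)
    if any(pattern in url for pattern in TECHNICAL_PATTERNS):
        print("technical page in sitemap:", url)
```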
The page load speed was low, so we prepared recommendations on how to improve it. Some pages now load within 5 seconds instead of the previous 21 seconds. However, more improvements still need to be implemented, given that load times are recommended to stay within 3 seconds.
We extended the company’s brand presence in search: we added contact details, completed the Google My Business profile, and set up snippets. This is how a snippet looked after the improvements were applied:
a. Setting up breadcrumbs that indicate the current page’s location within a site’s navigational hierarchy.
b. Removing the broken links from the site.
Broken links are internal links on your site that lead to non-existent pages. A large number of such links can lower your positions in the search results. We checked all the pages of the site and all of their links using special software. A list of the broken links was sent to the developers and content managers with a detailed description of the problem.
c. Removing internal links that pointed to redirects instead of leading directly to the final pages.
Google recommends using direct links instead of links that go through redirects. In addition, a large number of non-direct links negatively affects the load speed of the site. We identified redirects the same way as broken links; a combined sketch of both checks follows.
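Here is a minimal combined sketch of both checks. The list of internal links is illustrative; in practice it would come from a full crawl of the site.

```python
import requests

# Illustrative list; in practice the URLs come from a full site crawl.
internal_links = [
    "https://www.turkeyhomes.com/",
    "https://www.turkeyhomes.com/old-page",
]

broken, redirected = [], []
for url in internal_links:
    # Do not follow redirects: we want the status of the link itself.
    response = requests.head(url, allow_redirects=False, timeout=10)
    if response.status_code == 404:
        broken.append(url)
    elif response.status_code in (301, 302, 307, 308):
        redirected.append((url, response.headers.get("Location")))

print("broken links:", broken)
print("links to replace with direct ones:", redirected)
```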
d. Optimizing Alt tags for images.
You can add an Alt tag to every image. This helps Google rank the images for the respective queries. We identified a set of keywords for every web page where it made sense to set up Alt tags.
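A minimal sketch of an Alt-attribute audit for a single page might look like this; the page URL is illustrative.

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.turkeyhomes.com/property/penthouse-bodrum"  # illustrative
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Flag images whose Alt attribute is missing or empty.
for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        print("missing Alt text:", img.get("src"))
```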
e. Implementing the Product and RealEstateAgent Structured Data types.
Structured Data lets you mark up page content so that Google can index and interpret it. It is possible to mark up different things such as recipes, goods, product catalogs, logos, addresses and phone numbers, and so on. By marking up the product pages with schema.org/Product, we helped search engines understand where the name, price, and other attributes were. schema.org/RealEstateAgent was used to tag opening days and times, prices, and more.
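Here is a minimal sketch of schema.org/Product markup rendered as JSON-LD; the listing name, image URL, and price are illustrative values, not the client’s data.

```python
import json

# Illustrative values for a single property listing.
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Penthouse with Sea View in Bodrum",
    "image": "https://www.turkeyhomes.com/images/penthouse-bodrum.jpg",
    "offers": {
        "@type": "Offer",
        "price": "450000",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the output in the page inside
# <script type="application/ld+json">...</script>.
print(json.dumps(product_markup, indent=2))
```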
The team started the project in early September and completed the consulting stage in early October.
Implementing our recommendations took some time, and we consulted the client’s developers throughout. A significant part of the improvements was in place by the end of November. Then the pages were re-indexed, and we began to see the first results in December.
When we got started, the search traffic to the site was growing:
The total volume of traffic was growing, but this growth came from informational traffic to the blog pages, which convert at lower rates than product and catalog pages. Meanwhile, traffic to the service pages was decreasing when we started working on the project. In December, it began to increase slightly:
As a result, traffic to the conversion pages increased by 60%. Overall, site traffic grew by 113%.
This is what the dynamics of the client’s positions looked like: