How does mobiki support SEO?

If somebody wants to publish content on the web, they usually want to reach as big an audience as possible. One way is to have many other sites link to your mobiki site. But most often you also want to be found by search engines like Google, DuckDuckGo, ixquick or others. If you are not found, it is as if you do not exist.

Fortunately, you can do some Search Engine Optimization (SEO) of your website and its content. For that, mobiki provides the features described below. New features will be added, especially when they are easy to add and do not blow up the code too much.

Sitemap

After each page change, mobiki builds a new static sitemap.xml file in the software's base directory. This file is essential for search engines because it provides a list of all pages on your mobiki website - easy going for Google. As an example, a website with just two pages would be referenced like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
	<url>
		<loc>https://www.example.com/index.php?page=Home</loc>
	</url>
	<url>
		<loc>https://www.example.com/index.php?page=FAQ</loc>
	</url>
</urlset>

Meta information

A website's HTML content can be enriched with additional meta information. Think of an article, for example: there could be additional information like its title, author, publishing date and so on. All this information is already visible on any mobiki page, but how could a search engine know about it if it is just hidden somewhere in the rest of the text?

Mobiki therefore enriches the HTML with some extra tags that make this data explicit. Out of the box, mobiki provides the following extra information for every page (article):

  • The article's name or title
  • The publishing date
  • The author, as given in config.php
  • The language of the website, as given in config.php
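As an illustration, such data could be expressed as microdata roughly like this. The attribute names follow the schema.org vocabulary, but the exact markup mobiki emits may differ; title, date and author values here are made up:

<article itemscope itemtype="http://schema.org/Article">
	<h1 itemprop="name">How does mobiki support SEO?</h1>
	<meta itemprop="datePublished" content="2016-05-01">
	<meta itemprop="author" content="Jane Doe">
	...
</article>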

Specifically, mobiki provides microdata following the Article schema of schema.org.

International language support

Currently, mobiki supports only one language for all of the website's content. The main and only language can be set in the config file:

define('WIKI_LANGUAGE',		'en');

This setting is reflected at the beginning of the generated HTML code:

<html lang="en">

Of course you can set up another parallel mobiki installation with a different language setting and add manual cross links. But at the moment there is no automatic mechanism for interlinking.
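If you do run parallel installations, one option worth considering (mobiki does not do this for you) is to add alternate-language hints to each page's HTML head. The paths and page name here are purely hypothetical:

<link rel="alternate" hreflang="de" href="https://www.example.com/de/index.php?page=Startseite">

Search engines use such hreflang hints to offer readers the version of a page that matches their language.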


robots.txt

To stop robots from crawling useless or dynamically created pages, like the login page or the recent changes page, a webmaster can provide a robots.txt file in the domain's base directory. It contains instructions, or rather recommendations, for search robots and web crawlers to make their lives easier. The default robots.txt looks like this:

# mobiki general robots.txt file for all search engines
User-agent: *
Disallow: /index.php?action=recent
Disallow: /index.php?page=*&action=login
Disallow: /feed.xml

If you like, you can change it to keep specific pages or parts of your website out of a search engine's site index.
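For example, to additionally keep a page out of the index, you could append one more Disallow line. The page name Drafts here is just an illustration:

Disallow: /index.php?page=Drafts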
Back to FAQ