Manual:MediaWiki architecture
From the beginning, MediaWiki was developed specifically to be Wikipedia's software. Developers have worked to facilitate reuse by third parties, but the influence and biases of Wikipedia have shaped MediaWiki's architecture throughout its history.
Wikipedia is one of the top ten websites in the world, currently receiving about 400 million unique visitors a month. It gets over 100,000 hits per second. Wikipedia isn't supported by advertising; it is funded entirely by a non-profit organisation, the Wikimedia Foundation, which relies on donations as its primary funding model. This means that MediaWiki must not only run a top-ten website, but also do so on a tight budget. To meet these demands, MediaWiki has a heavy focus on performance, caching and optimisation. Features that can't be enabled on Wikipedia are either reverted or disabled through a configuration variable; there is an endless trade-off between performance and the number of features.
Wikipedia's influence on MediaWiki's architecture isn't limited to performance. Unlike generic CMSes, MediaWiki was originally written for a very specific purpose: supporting a community that creates and curates freely reusable knowledge on an open platform. This means, for example, that MediaWiki doesn't include features common in corporate CMSes (like an easy publication workflow or access control lists (ACLs)), but does offer a variety of tools to handle spam and vandalism.
So, from the start, the needs and actions of a constantly evolving community of Wikipedia participants have shaped MediaWiki's development, and vice versa. MediaWiki's architecture has been driven many times by initiatives started or requested by the community, such as the creation of Wikimedia Commons or the Flagged Revisions feature. Developers have made major architectural changes, such as the MediaWiki 1.12 preprocessor, because the way Wikipedians used MediaWiki made them necessary.
MediaWiki has also gained a solid base of external users by being open source software from the beginning. Third-party reusers know that, as long as a website as high-profile as Wikipedia uses MediaWiki, the software will be maintained and improved. MediaWiki used to be very focused on Wikimedia sites, but efforts have been made to make it more generic and to better accommodate the needs of these third-party users. For example, MediaWiki now ships with an excellent web-based installer, making the installation process much less painful than when everything had to be done via the command line and the software contained hardcoded paths for Wikipedia.
Still, MediaWiki is and remains Wikipedia's software, and this shows throughout its history and architecture.
MediaWiki code base and practices
| Layer | Components |
|---|---|
| User layer | Web browser |
| Network layer | Varnish, Apache web server |
| Logic layer | MediaWiki PHP scripts, PHP |
| Data layer | File system, MySQL database (program and content), caching system |
PHP
PHP was chosen as the framework for Wikipedia's "Phase II" software in 2001; MediaWiki has grown organically since then, and is still evolving today. Most MediaWiki developers are volunteers contributing in their free time, and there were very few of them in the early years. Some software design decisions or omissions may seem like mistakes in retrospect, but it's hard to criticise the founders for not implementing some abstraction that is now considered critical, when the initial code base was so small and the time taken to develop it so short.
For example, MediaWiki uses unprefixed class names, which can cause conflicts when PHP core and PECL developers add new classes. As a consequence, MediaWiki's Namespace class had to be renamed to MWNamespace to be compatible with PHP 5.3. Consistently using a class prefix (e.g. "MW") would have made it easier to embed MediaWiki inside another application or library.
Relying on PHP was probably not the best choice for performance, since it hasn't benefited from the improvements that other dynamic languages have seen. Using Java would have been much better for performance, and would have simplified scaling the execution of back-end maintenance tasks. On the other hand, PHP is very popular, which makes it easier to recruit new developers.
Even if MediaWiki still contains "ugly" legacy code, major improvements have been made over the years, and new architectural elements have been introduced throughout its history. They include the Parser, SpecialPage and Database classes, the Image class and the FileRepo class hierarchy, ResourceLoader, and the Action hierarchy. MediaWiki started without any of these elements, but all of them support features that have been around since the beginning. Many developers are interested primarily in feature development, and architecture is often left behind, only to be caught up with later, once the cost of working within an inadequate architecture becomes apparent.
Because MediaWiki is the platform for high-profile sites such as Wikipedia, core developers and code reviewers have enforced strict security rules. To make it easier to write secure code, MediaWiki gives developers wrappers around HTML output and database queries to handle escaping. To sanitise user input, the WebRequest class is used; it analyses data passed in the URL or via a POSTed form. It removes "magic quotes" slashes, strips illegal input characters and normalises Unicode sequences. Cross-site request forgery (CSRF) attacks are avoided by using tokens, and cross-site scripting (XSS) attacks by validating inputs and escaping outputs, usually with PHP's htmlspecialchars() function. MediaWiki also provides (and uses) an HTML sanitiser with the Sanitizer class, as well as database functions that prevent SQL injection attacks.
Configuration
MediaWiki offers hundreds of configuration settings, stored in PHP global variables. Their default value is set in DefaultSettings.php, and the system administrator can override them by editing LocalSettings.php.
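For instance, a minimal LocalSettings.php override might look like the following sketch ($wgSitename, $wgLanguageCode and $wgEnableUploads are real MediaWiki settings; the values are placeholders):

```php
<?php
// LocalSettings.php — overrides the defaults from DefaultSettings.php.
$wgSitename      = "My Example Wiki";  // name shown in the UI and page titles
$wgLanguageCode  = "fr";               // default content/interface language
$wgEnableUploads = true;               // allow file uploads on this wiki
```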
MediaWiki used to rely heavily on global variables, including for configuration and context processing. Globals have serious security implications with PHP's register_globals function (which MediaWiki hasn't needed since version 1.2). This system also limits potential abstractions for configuration, and makes it harder to optimise the start-up process. Moreover, the configuration namespace is shared with variables used for registration and object context, which can lead to conflicts. From a user perspective, global configuration variables have also made MediaWiki seem difficult to configure and maintain. MediaWiki's development has been a story of slowly moving context out of global variables and into objects. Storing processing context in object member variables allows those objects to be reused in a much more flexible way.
Database and text storage
MediaWiki has been using a relational database back-end since the Phase II software. The default (and best-supported) DBMS for MediaWiki is MySQL, which is the one used by all Wikimedia sites, although other DBMSes (such as PostgreSQL, Oracle and SQLite) have community-maintained implementations. A system administrator can choose a DBMS while installing MediaWiki, and MediaWiki provides both a database abstraction layer and a query abstraction layer that simplify database access for developers.
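As a hedged sketch of what the query abstraction looks like to developers (wfGetDB() and select() are real parts of the classic MediaWiki database API, and the page table is real core schema, but treat the snippet as illustrative rather than production code):

```php
<?php
// Illustrative use of MediaWiki's database abstraction layer.
// wfGetDB( DB_SLAVE ) returns a connection to a replica ("slave") database;
// the select() wrapper builds and escapes the SQL for the active DBMS.
$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select(
    'page',                            // table
    array( 'page_id', 'page_title' ),  // fields
    array( 'page_namespace' => 0 ),    // WHERE conditions (auto-escaped)
    __METHOD__,                        // caller name, used for profiling
    array( 'LIMIT' => 10 )
);
foreach ( $res as $row ) {
    echo $row->page_title . "\n";
}
```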
The current layout contains dozens of tables. Many are about the wiki's content (e.g. page, revision, category and recentchanges). Other tables include data about users (user, user_groups), media files (image, filearchive), caching (objectcache, l10n_cache, querycache) and internal tools (job for the job queue), among others. Indexes and summary tables are used extensively in MediaWiki, since SQL queries that scan huge numbers of rows can be very expensive, particularly on Wikimedia sites. Unindexed queries are usually discouraged.
The database has gone through dozens of schema changes over the years, the most notable being the decoupling of text storage and revision tracking in MediaWiki 1.5. In the 1.4 model, the content was stored in two important tables, cur (containing the text and metadata of the current revision of the page) and old (containing previous revisions); deleted pages were stored in archive. When an edit was made, the previously current revision was copied to the old table, and the new edit was saved to cur. When a page was renamed, the page title had to be updated in the metadata of all the old revisions, which could take a long time. When a page was deleted, its entries in both the cur and old tables had to be copied to the archive table before being deleted; this meant moving the text of all revisions, which could be a very large amount of data and therefore take time. In the 1.5 model, revision metadata and revision text were split: the cur and old tables were replaced with page (pages' metadata), revision (metadata for all revisions, old or current) and text (text of all revisions, old, current or deleted).
Now, when an edit is made, revision metadata doesn't need to be copied around tables: inserting a new entry and updating the page_latest pointer is enough. Also, the revision metadata doesn't include the page title any more, only its ID: this removes the need for renaming all revisions when a page is renamed. The revision table stores metadata for each revision, but not the revision text; instead, each row contains a text ID pointing to the text table, which contains the actual text. When a page is deleted, the text of all its revisions stays in place and doesn't need to be moved to another table.
The text table is composed of a mapping of IDs to text blobs; a flags field indicates if the text blob is gzipped (for space savings) or if the text blob is only a pointer to external text storage. Wikimedia sites use a MySQL-backed external storage cluster with blobs of a few dozen revisions. The first revision of the blob is stored in full, and following revisions to the same page are stored as diffs relative to the previous revision; the blobs are then gzipped. Because the revisions are grouped per page, they tend to be similar, so the diffs are relatively small and gzip works well. The compression ratio achieved on Wikimedia sites nears 98%.
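To make the 1.5 schema concrete, here is a hedged sketch (using the same classic database wrapper as above; illustrative only) of how the current text of a page can be fetched by following the page_latest → revision → text pointers:

```php
<?php
// Illustrative: follow page_latest -> revision.rev_text_id -> text.old_text.
// Field names (page_latest, rev_text_id, old_id, old_text, old_flags) are the
// real pre-MCR core schema; error handling is omitted for brevity.
$dbr = wfGetDB( DB_SLAVE );
$row = $dbr->selectRow(
    array( 'page', 'revision', 'text' ),
    array( 'old_text', 'old_flags' ),
    array( 'page_namespace' => 0, 'page_title' => 'Apple' ),
    __METHOD__,
    array(),
    array(
        'revision' => array( 'INNER JOIN', 'rev_id = page_latest' ),
        'text'     => array( 'INNER JOIN', 'old_id = rev_text_id' ),
    )
);
// $row->old_flags may contain "gzip" or "external", per the flags field above.
```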
MediaWiki also has built-in support for load balancing, added as early as 2004 in MediaWiki 1.2 (when Wikipedia got its second server — a big deal at the time). The load balancer (MediaWiki's PHP code that decides which server to connect to) is now a critical part of Wikimedia's infrastructure, which explains its influence on some algorithm decisions in the code. The system administrator can specify in MediaWiki's configuration that there is one master database server, and any number of slave database servers; a weight can be assigned to each server. The load balancer will send all writes to the master, and will balance reads according to the weights. It also keeps track of the replication lag of each slave. If a slave's replication lag exceeds 30 seconds, it will not receive any read queries to allow it to catch up; if all slaves are lagged more than 30 seconds, MediaWiki will automatically put itself in read-only mode.
MediaWiki's "chronology protector" ensures that replication lag never causes a user to see a page that claims an action they've just performed hasn't happened yet. This is done by storing the master's position in the user's session if a request they made resulted in a write query. The next time the user makes a read request, the load balancer reads this position from the session, and tries to select a slave that has caught up to that replication position to serve the request. If none is available, it will wait until one is. It may appear to other users as though the action hasn't happened yet, but the chronology remains consistent for each user.
Requests, caching and delivery
Execution workflow of a web request
index.php is the main access point for MediaWiki, and handles most requests processed by the application servers (i.e. requests that were not served by the caching infrastructure; see below). The code executed from index.php performs security checks, loads default configuration settings from includes/DefaultSettings.php, guesses configuration with includes/Setup.php and then applies site settings contained in LocalSettings.php. It then instantiates a MediaWiki object ($mediawiki), and creates a Title object ($wgTitle) depending on the title and action parameters from the request.
index.php can take a variety of action parameters in the URL request; the default action is view, which shows the regular view of an article's content. For example, the request https://en.wikipedia.org/w/index.php?title=Apple&action=view displays the content of the article "Apple" on the English Wikipedia[1]. Other frequent actions include edit (to open an article for editing), submit (to preview or save an article), history (to show an article's history) and watch (to add an article to the user's watchlist). Administrative actions include delete (to delete an article) and protect (to prevent edits to an article).
MediaWiki::performRequest() is then called to handle most of the URL request. It checks for bad titles, read restrictions, local interwiki redirects, and redirect loops, and determines whether the request is for a normal or a special page. Normal page requests are handed over to MediaWiki::initializeArticle(), to create an Article object for the page ($wgArticle), and then to MediaWiki::performAction(), which handles "standard" actions. Once the action has been completed, MediaWiki::doPostOutputShutdown() finalises the request by committing DB transactions, outputting the HTML and launching deferred updates through the job queue. MediaWiki::restInPeace() commits the deferred updates and closes the task gracefully.
If the page requested is a special page (i.e., not a regular wiki content page, but a special software-related page such as Statistics), SpecialPageFactory::executePath is called instead of initializeArticle(); the corresponding PHP script is then called. Special pages can do all sorts of magical things, and each has a specific purpose, usually independent of any one article or its content. Special pages include various kinds of reports (recent changes, logs, uncategorised pages) and wiki administration tools (user blocks, user rights changes), among others. Their execution workflow depends on their function.
Many functions contain profiling code, which makes it possible to follow the execution workflow for debugging, if profiling is enabled. Profiling is done by calling the wfProfileIn and wfProfileOut functions to respectively start and stop profiling a function; both functions take the function's name as a parameter. On Wikimedia sites, profiling is done for a percentage of all requests, to preserve performance. MediaWiki sends UDP packets to a central server that collects them and produces profiling data.
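In practice the pattern looks like this minimal sketch (wfProfileIn/wfProfileOut are the real legacy profiling functions; the surrounding function is hypothetical):

```php
<?php
// Hypothetical function instrumented with MediaWiki's legacy profiling calls.
function doSomethingExpensive() {
    wfProfileIn( __METHOD__ );   // start timing, keyed by function name
    // ... expensive work here ...
    wfProfileOut( __METHOD__ );  // stop timing; data is aggregated per name
}
```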
Assembly of a non-cached page
When viewing a page, HTML code may be taken from the cache (see below); if not, first the templates, parser functions and variables are expanded. This gives the expanded wikitext, an intermediate result which can be seen with Special:ExpandTemplates, and depends on:
- the wikitext;
- the templates directly or indirectly referred to;
- the parser functions directly or indirectly referred to;
- the values of variables directly or indirectly referred to.
Next, this expanded wikitext is converted to HTML code; it is sent to the user, and contains references to CSS, JavaScript, and image files. The user can see this intermediate result by applying the "view source" option of the browser. The HTML code for a given page depends on:
- the expanded wikitext;
- the mode, such as viewing or editing (see below);
- the existence of internally linked pages (gives view or edit link);
- the skin and other user preferences;
- the user's name;
- the status of the user (more links if a sysop, etc.);
- the namespace (determines the link to the Talk page, or in the case of a Talk page, the page concerned);
- whether the page is watched by the user (gives watch or unwatch link);
- whether the user's Talk page has been recently edited (gives a message).
Finally, the browser renders the HTML, using the files it refers to. The result the user sees on the screen depends on:
- the HTML code;
- files referred to by the HTML code, such as embedded images, server-side CSS files, and JavaScript files;
- the browser and browser settings, including possibly a local CSS file, and the screen resolution.
If JavaScript is responding to an event such as a mouse click, the page on the screen depends also on these events. This applies, for example, in the case of a sortable table.
When the user selects the edit tab, the wikitext itself is sent to them, of the whole page or of one section only. When the user presses Show preview, their new version of the wikitext is sent to the server, which sends the corresponding new version of the HTML code, which is rendered again and displayed above or below the user's new version of the wikitext (which the server has also returned). After possibly more changes and more previews, the user presses Save page, sending the user's "final" version to the server, which now records the edit and sends the HTML of the new version (again). In some cases an automatic conversion of wikitext also takes place in this stage.
Caching
MediaWiki itself is improved for performance because it plays a central role on Wikimedia sites, but it is also part of a larger operational ecosystem that has influenced its architecture. Wikimedia's caching infrastructure has imposed limitations in MediaWiki; developers worked around the issues, not by trying to shape Wikimedia's extensively optimised caching infrastructure around MediaWiki, but rather by making MediaWiki more flexible, so it could work within that infrastructure, without compromising on performance and caching needs.
On Wikimedia sites, most requests are handled by reverse caching proxies (Squids), and never even make it to the MediaWiki application servers. Squids contain static versions of entire rendered pages, served for simple reads to users who aren't logged in to the site. MediaWiki natively supports Squid and Varnish, and integrates with this caching layer by, for example, notifying them to purge a page from the cache when it has been changed. For logged-in users, and other requests that can't be served by Squids, Squid forwards the requests to the web server (Apache).
The second level of caching happens when MediaWiki renders and assembles the page from multiple objects, many of which can be cached to minimise future calls. Such objects include the page's interface (sidebar, menus, UI text) and the content proper, parsed from wikitext. The in-memory object cache has been available in MediaWiki since the early 1.1 version (2003), and is particularly important to avoid re-parsing long and complex pages.
Login session data can also be stored in memcached, which lets sessions work transparently on multiple front-end web servers in a load-balancing setup (Wikimedia heavily relies on load balancing, using LVS with PyBal).
Since version 1.16, MediaWiki uses a dedicated object cache for localised UI text; this was added after noticing that a large part of the objects cached in memcached consisted of UI messages localised into the user's language. The system is based on fast fetches of individual messages from constant databases (CDB), i.e. files with key-value pairs. CDBs minimise memory overhead and start-up time in the typical case; they're also used for the interwiki cache.
The last caching layer consists of the PHP opcode cache, commonly enabled to speed up PHP applications. Compilation can be a lengthy process; to avoid compiling PHP scripts into opcode every time they're invoked, a PHP accelerator can be used to store the compiled opcode and execute it directly without compilation. MediaWiki will "just work" with many accelerators, such as APC or PHP Accelerator.
Because of its Wikimedia bias, MediaWiki is optimised for this complete, multi-layer, distributed caching infrastructure. Nonetheless, it also natively supports alternate setups for smaller sites. For example, it offers an optional simplistic file caching system that stores the output of fully rendered pages, like Squid does. Also, MediaWiki's abstract object caching layer lets it store the cached objects in several places, including the file system, the database, or the opcode cache.
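As a hedged sketch of the abstract object-cache layer (the $wgMemc global and the wfMemcKey() helper are real legacy MediaWiki APIs; the key and the computation are invented):

```php
<?php
// Illustrative use of MediaWiki's abstract object cache.
// $wgMemc points at whatever backend is configured: memcached, the
// database, the opcode cache, or plain PHP arrays on small installs.
function getExpensiveResult() {
    global $wgMemc;
    $key = wfMemcKey( 'example', 'expensive-result' ); // wiki-prefixed key
    $value = $wgMemc->get( $key );
    if ( $value === false ) {                 // cache miss
        $value = computeExpensiveResult();    // hypothetical expensive work
        $wgMemc->set( $key, $value, 3600 );   // cache for one hour
    }
    return $value;
}
```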
Like in many web applications, MediaWiki's interface has become more interactive and responsive over the years, mostly through the use of JavaScript. Usability efforts initiated in 2008, as well as advanced media handling (e.g. online editing of video files), called for dedicated front-end performance improvements.
To optimise the delivery of JavaScript and CSS assets, the ResourceLoader module was developed. Started in 2009, it was completed in 2011 and has been a core feature of MediaWiki since version 1.17. ResourceLoader works by loading JS and CSS assets on demand, thus reducing loading and parsing time for unused features, for example in older browsers. It also minifies the code, groups resources to save requests, and can embed images as data URIs.
Languages
Context and rationale
A central part of effectively contributing and disseminating free knowledge to all is to provide it in as many languages as possible. Wikipedia is available in more than 280 languages, and encyclopedia articles in English represent less than 20 % of all articles. Because Wikipedia and its sister sites exist in so many languages, it is important not only to provide the content in the readers' native language, but also to provide a localised interface, and effective input and conversion tools, so that participants can contribute content.
For this reason, localisation and internationalisation (l10n & i18n) are a central component of MediaWiki. The i18n system is pervasive, and impacts many parts of the software; it's also one of the most flexible and feature-rich. Translator convenience is usually preferred to developer convenience, but this is believed to be an acceptable cost.
MediaWiki is currently localised in more than 350 languages, including non-Latin and right-to-left (RTL) languages, with varying levels of completion. The interface and content can be in different languages, and can have mixed directionality.
Content language
MediaWiki originally used per-language encoding, which led to a lot of issues; for example, foreign scripts could not be used in page titles. UTF-8 was adopted instead. Support for character sets other than UTF-8 was dropped in 2005, along with the major database schema change in MediaWiki 1.5; content must now be encoded in UTF-8.
Characters not available on the editor's keyboard can be customised and inserted via MediaWiki's Edittools, an interface message that appears below the edit window; its JavaScript version automatically inserts the character clicked into the edit window. The WikiEditor extension for MediaWiki, developed as part of a usability effort, merges special characters with the edit toolbar.
Another extension, called UniversalLanguageSelector, provides additional input methods and key mapping features for non-ASCII characters. Recent and future improvements include better support for right-to-left text, bidirectional text (LTR and RTL text on the same page) and UniversalLanguageSelector.
Interface language
Interface messages have been stored in PHP arrays of key-value pairs since the Phase III software was created. Each message is identified by a unique key, which is assigned different values across languages. Keys are determined by developers, who are encouraged to use prefixes for extensions; for example, message keys for the UploadWizard extension start with mwe-upwiz-, where mwe stands for MediaWiki extension.
MediaWiki messages can embed parameters provided by the software, which will often influence the grammar of the message. In order to support virtually any possible language, MediaWiki's localisation system has been improved and made more complex over time to accommodate languages' specific traits and exceptions, often considered oddities by English speakers. For example, adjectives are invariable words in English, but languages like French require adjective agreement with nouns. If the user profile has gender preferences set, the {{GENDER:}} switch can be used in interface messages to address the user appropriately. Other switches include {{PLURAL:}}, for "simple" plurals and for languages like Arabic with dual, trial or paucal numbers, and {{GRAMMAR:}}, providing grammatical transformation functions for languages like Finnish, whose grammatical cases cause alterations or inflections.
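For example, a localised message using one of these switches might be defined like the following hedged sketch (the message key and values are invented; the {{PLURAL:$1|...}} syntax and the legacy $messages array format are real):

```php
<?php
// Hypothetical message definitions showing the {{PLURAL:}} switch.
// $1 is a parameter passed by the software; the branch picked depends
// on the plural rules of the message's language.
$messages['en'] = array(
    'myext-file-count' => 'This page uses $1 {{PLURAL:$1|file|files}}.',
);
$messages['fr'] = array(
    'myext-file-count' => 'Cette page utilise $1 {{PLURAL:$1|fichier|fichiers}}.',
);
```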
The gender distinction can also be used in gender-dependent user namespace names, so that the title and URL of the page refer to the user correctly. Standard MediaWiki namespaces' gender variants are defined via $namespaceGenderAliases in each language's MessagesXx.php, while $wgExtraGenderNamespaces can be used for wiki-specific namespaces.
As of r107559, 13 languages use this feature by default:
- Arabic
- Czech
- German
- Lower Sorbian
- Spanish
- Galician
- Hebrew
- Upper Sorbian
- Polish
- Brazilian Portuguese
- Portuguese
- Russian
- Saterland Frisian
Localising messages
Localised interface messages for MediaWiki reside in MessagesXx.php files, where Xx is the ISO 639 code of the language (e.g. MessagesFr.php for French); default messages are in English, stored in MessagesEn.php. MediaWiki extensions use a similar system, or host all their localised messages in an [Extension-name].i18n.php file. Along with translations, message files also include language-dependent information such as date formats.
Contributing translations used to be done by submitting PHP patches for the MessagesXx.php files.
In December 2003, MediaWiki 1.1 introduced "database messages", a subset of wiki pages in the MediaWiki namespace containing interface messages. The content of the wiki page MediaWiki:[Message-key] is the message's text, and overrides its value in the PHP file. Localised versions of the message are at MediaWiki:[Message-key]/[language-code], e.g. MediaWiki:Rollbacklink/de. This feature has allowed power users to translate (and customise) interface messages locally on their wiki, but the process doesn't update the i18n files shipping with MediaWiki.
In 2006, Niklas Laxström created a special, heavily hacked MediaWiki website (now hosted at translatewiki.net) where translators can easily localise interface messages in all languages, simply by editing a wiki page. The MessagesXx.php files are then updated in the MediaWiki code repository, where they can be automatically fetched by any wiki. On Wikimedia sites, database messages are now only used for customisation, and not for localisation any more. MediaWiki extensions and some related programs, such as bots, are also localised at translatewiki.net.
To help translators understand the context and meaning of an interface message, it is considered good practice in MediaWiki to provide documentation for every message. This documentation is stored in a special message file, with the qqq language code, which doesn't correspond to a real language. The documentation for each message is then displayed in the translation interface on translatewiki.net.
Another helpful tool is the qqx language code: when used with the &uselang parameter to display a wiki page (e.g. en.wikipedia.org/wiki/Special:RecentChanges?uselang=qqx), MediaWiki will display the message keys instead of their values in the user interface; this is very useful for identifying which message to translate or change.
Registered users can set their own interface language in their preferences, in which case it overrides the site's default interface language.
MediaWiki also supports fallback languages: if a message isn't available in the chosen language, it will be displayed in the closest possible language, and not necessarily in English. For example, the fallback language for Breton is French.
Users
Users are represented in the code as instances of the User class, which encapsulates all user-specific settings (user ID, name, rights, password, email address, etc.). Client classes use accessors to access these fields; they do all the work of determining whether the user is logged in, and whether the requested option can be satisfied from cookies or whether a database query is needed. Most of the settings needed for rendering normal pages are set in the cookie to minimise use of the database.
MediaWiki provides a very granular permissions system, with basically a user permission for every possible action. For example, to perform the "Rollback" action (i.e. to "quickly rollback the edits of the last user who edited a particular page"), a user needs the rollback permission, included by default in MediaWiki's sysop user group. But it can also be added to other user groups, or a dedicated user group can be created that provides only this permission (as on the English Wikipedia, with the Rollbackers group). Customisation of user rights is done by editing the $wgGroupPermissions array in LocalSettings.php; for instance, $wgGroupPermissions['user']['movefile'] = true; allows all registered users to rename files. A user can belong to several groups, and inherits the highest rights associated with each of them.
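A hedged LocalSettings.php sketch of the pattern described above ($wgGroupPermissions and the rollback right are real; the "rollbacker" group name is an invented example):

```php
<?php
// Illustrative LocalSettings.php fragment: create a dedicated group that
// only grants the rollback permission, as the English Wikipedia does.
$wgGroupPermissions['rollbacker']['rollback'] = true;

// Rights can also be granted or removed per group (example):
$wgGroupPermissions['user']['edit'] = true;   // registered users may edit
$wgGroupPermissions['*']['edit']    = false;  // anonymous users may not
```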
However, MediaWiki's user permissions system was really designed with Wikipedia in mind, i.e. a site whose content is accessible to all, and only certain actions are restricted to some users. MediaWiki lacks a unified, pervasive permissions concept; it doesn't provide traditional CMS features like restricting read or write access by namespace, category, etc. A few MediaWiki extensions provide such features to some extent.
Content
Content structure
The concept of namespaces was used in the UseModWiki era of Wikipedia, where talk pages were at the title "[article name]/Talk". Namespaces were formally introduced in Magnus Manske's first "PHP script". They were reimplemented a few times over the years, but have kept the same function: to separate different kinds of content. They consist of a prefix, separated from the page title by a colon (e.g. Talk:, File: or Template:); the main content namespace has no prefix.
Wikipedia users quickly adopted them, and they provided the community with different spaces to evolve.
Namespaces have proven to be an important feature of MediaWiki, as they create the necessary preconditions for a wiki's community and set up meta-level discussions, community processes, portals, user profiles, etc.
The default configuration for MediaWiki's main content namespace is to be flat (no subpages), because that is how Wikipedia works, but it is trivial to enable them. They are enabled in other namespaces (e.g. User:, where people can for instance work on draft articles) and display breadcrumbs.
Namespaces separate content by type; within the same namespace, pages can be organised by topic using categories, a pseudo-hierarchical organisation scheme introduced in MediaWiki 1.3.
Content processing: MediaWiki markup language & Parser
The user-generated content stored by MediaWiki isn't in HTML, but in a markup language specific to MediaWiki, sometimes called "wikitext". It allows users to make formatting changes (e.g. bold, italic using quotes), add links (using square brackets), include templates, insert context-dependent content (like a date or signature), and make an incredible number of other magical things happen.
To display a page, this content needs to be parsed, assembled from all the external or dynamic pieces it calls, and converted to proper HTML. The parser is one of the most essential parts of MediaWiki, which also makes it difficult to change or improve. Because hundreds of millions of wiki pages worldwide depend on the parser to continue outputting HTML the way it always has, it has to remain extremely stable.
The markup language wasn't formally specified from the beginning; it started based on UseModWiki's markup, then morphed and evolved as needs demanded. For example, the use of a ThreadMode format for discussions led Magnus Manske to implement the three- or four-tilde shortcut (~~~~) for signing one's posts in unstructured text. Tildes were chosen because they resembled his father's handwritten signature.[2]
In the absence of a formal specification, the MediaWiki markup language has become a complex and idiosyncratic language, basically only compatible with MediaWiki's parser; it can't be represented as a formal grammar using BNF, EBNF or ANTLR syntaxes. The current parser's specification is jokingly referred to as "whatever the parser spits out from wikitext, plus a few hundred test cases".
There have been many attempts at alternative parsers, but none has succeeded so far. In 2004, an experimental tokeniser was written by Jens Frank to parse wikitext, and enabled on Wikipedia; it had to be disabled three days later, because of the poor performance of PHP array memory allocations. Since then, most of the parsing has been done with a huge pile of regular expressions, and a ton of helper functions. The wiki markup, and all the special cases the parser needs to support, have also become considerably more complex, making future attempts even more difficult.
A notable improvement was Tim Starling's preprocessor rewrite in MediaWiki 1.12, whose main motivation was to improve the parsing performance on pages with complex templates.
The preprocessor converts wikitext to an XML DOM tree representing parts of the document (template invocations, parser functions, tag hooks, section headings, and a few other structures), but it can skip "dead branches" in template expansion, such as unfollowed #switch cases and unused defaults for template arguments. The parser then iterates through the DOM structure and converts its content to HTML.
Recent work on a visual editor for MediaWiki has made it necessary to improve the parsing process (and make it faster), so work has resumed on the parser and intermediate layers between MediaWiki markup and final HTML (see Future, below).
Magic words and templates
MediaWiki offers "magic words" that modify the general behaviour of the page or include dynamic content into it. They consist of: behaviour switches like __NOTOC__ (to hide the automatic table of contents) or __NOINDEX__ (to tell search engines not to index the page); variables like {{CURRENTTIME}} or {{SITENAME}}; and parser functions, i.e. magic words that can take parameters, like {{lc:[string]}} (to output [string] in lowercase). Constructs like {{GENDER:}}, {{PLURAL:}} and {{GRAMMAR:}}, used to localise the UI, are parser functions.
The most common way to include content from other pages in a MediaWiki page is to use templates. Templates were really intended to be used to include the same content on different pages, e.g. navigation panels or maintenance banners on Wikipedia articles; having the ability to create partial page layouts and reuse them in thousands of articles with central maintenance made a huge impact on sites like Wikipedia.
However, templates have also been used (and abused) by users for a completely different purpose. MediaWiki 1.3 made it possible for templates to take parameters that change their output; the ability to add a default parameter (introduced in MediaWiki 1.6) enabled the construction of a functional programming language implemented on top of PHP, which was ultimately one of the most costly features in terms of performance.
Tim Starling then developed additional parser functions (the ParserFunctions extension), as a stopgap measure against insane constructs created by Wikipedia users with templates.
This set of functions included logical structures like #if and #switch, and other functions like #expr (to evaluate mathematical expressions) and #time (for time formatting).
Soon enough, Wikipedia users started to create even more complex templates using the new functions, which considerably degraded the parsing performance on template-heavy pages. The new preprocessor introduced in MediaWiki 1.12 (a major architectural change) was implemented to partly remedy this issue. Later, MediaWiki developers discussed the possibility of using an actual scripting language to improve performance; Extension:Scribunto was added in February 2013.
Media files
Users upload files through the Special:Upload page; administrators can configure the allowed file types through an extension whitelist. Once uploaded, files are stored in a folder on the file system, and thumbnails in a dedicated thumb directory.
Because of Wikimedia's educational mission, MediaWiki supports file types that may be uncommon in other web applications or CMSes, like SVG vector images, and multipage PDFs & DjVus. They are rendered as PNG files, and can be thumbnailed and displayed inline, as are more common image files like GIFs, JPGs and PNGs.
When a file is uploaded, it is assigned a File: page containing information entered by the uploader; this is free text, which usually includes copyright information (author, license) and items describing or classifying the content of the file (description, location, date, categories, etc.). While private wikis may not care much about this information, on media libraries like Wikimedia Commons it is critical for organising the collection and ensuring the legality of sharing these files.
It has been argued that most of this metadata should, in fact, be stored in a queryable structure like a database table. This would considerably facilitate search, as well as attribution and reuse by third parties, for example through the API.
Most Wikimedia sites also allow "local" uploads to each wiki, but the community tries to store freely-licensed media files in Wikimedia's free media library, Wikimedia Commons. Any Wikimedia site can display a file hosted on Commons as if it were hosted locally. This custom avoids having to upload a file to every wiki to use it there.
As a consequence, MediaWiki natively supports foreign media repositories, i.e., the ability to access media files hosted on another wiki through its API and the ForeignAPIRepo system. Since version 1.16, any MediaWiki website can easily use files from Wikimedia Commons through the InstantCommons feature. When using a foreign repository, thumbnails are stored locally to save bandwidth. However, it is not (yet) possible to upload to a foreign media repository from another wiki.
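Turning on InstantCommons is a single setting; a minimal sketch ($wgUseInstantCommons is the real setting name):

```php
<?php
// Illustrative LocalSettings.php fragment: let this wiki display files
// hosted on Wikimedia Commons as if they were local (InstantCommons).
$wgUseInstantCommons = true;
```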
Customising and extending MediaWiki
Levels
MediaWiki's architecture provides different ways to customise and extend the software. This can be done at different levels of access:
- System administrators can install extensions and skins, and configure the wiki's separate helper programs (e.g. for image thumbnailing and TeX rendering) and global settings (see Configuration above).
- Wiki sysops (sometimes called "administrators" too) can edit site-wide gadgets, JavaScript and CSS settings.
- Any registered user can customise their own experience and interface using their preferences (for existing settings, skins and gadgets) or make their own modifications (using their personal JS and CSS pages). External programs can also communicate with MediaWiki through its machine API, if it's enabled, basically making any feature and data accessible to the user.
JavaScript and CSS
MediaWiki can read and apply site-wide or skin-wide JavaScript and CSS using custom wiki pages; these pages are in the MediaWiki: namespace, and thus can only be edited by sysops. For example, JavaScript modifications from MediaWiki:Common.js apply to all skins, CSS from MediaWiki:Common.css applies to all skins, but MediaWiki:Vector.css only applies to users with the Vector skin.
Users can make the same types of changes, which will only apply to their own interface, by editing subpages of their user page (e.g. User:[Username]/common.js for JavaScript on all skins, User:[Username]/common.css for CSS on all skins, or User:[Username]/vector.css for CSS modifications that only apply to the Vector skin).
If the Gadgets extension is installed, sysops can also edit gadgets, i.e. snippets of JavaScript code providing features that can be turned on and off by users in their preferences. Upcoming developments on gadgets will make it possible to share gadgets across wikis, thus avoiding duplication.
This set of tools has had a huge impact and greatly increased the democratisation of MediaWiki's software development. Individual users are empowered to add features for themselves; power users can share them with others, both informally and through globally-configurable sysop-controlled systems. This framework is ideal for small, self-contained modifications, and presents a lower barrier of entry than heavier code modifications done through hooks and extensions.
Extensions and skins
When JavaScript and CSS modifications are not enough, MediaWiki provides a system of hooks that let third-party developers run custom PHP code before, after, or instead of MediaWiki code for particular events. MediaWiki extensions use hooks to plug into the code.
Before hooks existed in MediaWiki, adding custom PHP code meant modifying the core code, which was neither easy nor recommended. The first hooks were proposed and added in 2004 by Evan Prodromou; many more have been added over the years when needed. Using hooks, it is even possible to extend MediaWiki's wiki markup with additional capabilities, using tag extensions.
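A hedged sketch of both mechanisms (the $wgHooks registry, the ArticleSaveComplete and ParserFirstCallInit hooks, and the setHook tag-extension API are real legacy interfaces; the handler bodies and names are invented):

```php
<?php
// Illustrative extension setup file: plug into MediaWiki via hooks.

// 1. Run custom code when an event fires (here: after an article is saved).
$wgHooks['ArticleSaveComplete'][] = 'onArticleSaveComplete';
function onArticleSaveComplete( $article, $user, $text ) {
    wfDebugLog( 'myext', 'Page saved: ' . $article->getTitle()->getText() );
    return true; // returning true lets other handlers run
}

// 2. Tag extension: make <myTag>...</myTag> available in wiki markup.
$wgHooks['ParserFirstCallInit'][] = 'onParserFirstCallInit';
function onParserFirstCallInit( $parser ) {
    $parser->setHook( 'myTag', 'renderMyTag' );
    return true;
}
function renderMyTag( $input, array $args, $parser, $frame ) {
    // Escape the tag's content before returning HTML.
    return '<span class="my-tag">' . htmlspecialchars( $input ) . '</span>';
}
```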
The extension system isn't perfect: extension registration is based on code execution at startup, rather than cacheable data, which limits abstraction and optimisation and hurts MediaWiki's performance. But overall, the extension architecture is now a fairly flexible infrastructure that has helped make specialised code more modular, keeping the core software from expanding (too) much, and making it easier for third-party users to build custom functionality on top of MediaWiki.
Conversely, it's very difficult to write a new skin for MediaWiki without reinventing the wheel. In MediaWiki, skins are PHP classes, each extending the parent Skin class; they contain functions that gather the information needed to generate the HTML.
The long-lived "MonoBook" skin was difficult to customise because it contained a lot of browser-specific CSS to support old browsers; editing the template or CSS required many subsequent changes to reflect the change for all browsers and platforms.
The other main access point for MediaWiki, besides index.php, is api.php, used to access its machine-readable query API (Application Programming Interface). Wikipedia users originally created "bots" that worked by screen scraping the HTML content served by MediaWiki; this method was very unreliable and broke many times. To improve this situation, developers introduced a read-only interface (located at query.php), which then evolved into a full-fledged read and write machine API providing direct, high-level access to the data contained in the MediaWiki database.
Client programs can use the API to login, get data, and post changes. The API supports thin web-based JavaScript clients and end-user applications. Almost anything that can be done via the web interface can basically be done through the API. Client libraries implementing the MediaWiki API are available in many languages, including Python and .NET.
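For instance, a minimal client call might look like this sketch (the api.php query parameters shown are part of the real MediaWiki action API; error handling is omitted for brevity):

```php
<?php
// Illustrative API client: fetch basic info about the "Apple" page
// from the English Wikipedia's machine API, as JSON.
$url = 'https://en.wikipedia.org/w/api.php?'
     . http_build_query( array(
         'action' => 'query',
         'titles' => 'Apple',
         'prop'   => 'info',
         'format' => 'json',
     ) );
$data = json_decode( file_get_contents( $url ), true );
print_r( $data['query']['pages'] );
```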
Layers, domains, and patterns
MediaWiki can be divided into around 12 technical layers, with each layer calling classes and code in the layer beneath it but not above it. Examples include the installer layer, entry point layer, wiring layer, and API layer. Code spanning all the layers can be grouped into around 21 domain modules, with examples including the navigation domain (skins), user management domain (create, rename, login), and internationalisation domain. Many software design patterns are used in MediaWiki, including the factory pattern, handler pattern, and command pattern.
Future
What started as a summer project done by a single volunteer PHP developer has grown into MediaWiki, a mature, stable wiki engine powering a top-ten website with a ridiculously small operational infrastructure. This has been made possible by constant optimisation for performance, iterative architectural changes and a team of awesome developers.
The evolution of web technologies, and the growth of Wikipedia, call for ongoing improvements and new features, some of which require major changes to MediaWiki's architecture. This is, for example, the case for the ongoing visual editor project, which has prompted renewed work on the parser and on the wiki markup language, the DOM and final HTML conversion.
MediaWiki is a tool that is used for varied purposes. Within Wikimedia projects, for instance, it's used to create and curate an encyclopedia (Wikipedia), to power a huge media library (Wikimedia Commons), to transcribe scanned reference texts (Wikisource), and so on. In other contexts, MediaWiki is used as a corporate CMS, or as a data repository, sometimes combined with a semantic framework. These specialised uses that weren't planned for will probably continue to drive constant adjustments to the software's internal structure. As such, MediaWiki's architecture is very much alive, just like the immense community of users it supports.
Notes and references
- ↑ View requests are usually prettified with URL rewriting, in this example to w:Apple.
- ↑ https://twitter.com/MagnusManske/status/1083507467802365952
Further reading
- MediaWiki documentation and support: https://www.mediawiki.org
- Automatically-generated MediaWiki documentation: https://doc.wikimedia.org
- Domas Mituzas, Wikipedia: site internals, configuration, code examples and management issues, MySQL Users conference, 2007. Full text available at http://dom.as/talks/
- Faidon Liambotis, The Wikimedia infrastructure, dotScale 2014. (YouTube video)