In these pages we compare MPM with the latest versions available in Debian (at the time of writing) of four state-of-the-art package managers. Our goal was to assess the improvements in solution quality attainable with our modular architecture by reusing solvers that participated in the Mancoosi International Solver Competition (MISC~2011). This document is a companion to the tests performed during the EDOS project.
In the Figure above we show the aggregate results of our tests, where solutions are ranked according to the "paranoid" criterion (see the MISC 2011 criteria for details) and then divided into three categories: "best solution", "sub-optimal" and "failure". The first two categories contain, respectively, the best solutions (including ex-aequo ties) and solutions that are correct but not optimal. The failure category aggregates all results that were either not a solution at all, or were the result of a timeout or a crash of the package manager.
When looking for a solution in a universe containing only the 'sarge' baseline, all solvers find a solution in most cases, and this solution is the best one in roughly 75% of the cases, with the notable exception of 'smart', which is almost as good as MPM. The situation changes radically as soon as more than one baseline is considered: all legacy package managers fail to find a solution in more than 40% of the cases, and their hit rate drops further as more baselines are added. This is consistent with the folklore experience of GNU/Linux distribution users, who know that maintaining an "old" machine using these tools becomes more difficult over time. On the other hand, MPM is remarkably stable and consistently finds the best solution, no matter what the composition of the universe is. This is not a surprise, as MPM relies on a state-of-the-art external solver that is required to find a "global" solution, optimal w.r.t. the paranoid criterion, unlike the legacy package managers, which apply ad-hoc local search algorithms. These results also justify our choice not to compare execution times: when the maintenance problem becomes difficult, MPM is the only viable tool, even in its prototype, unoptimized form.
We performed five groups of tests, using the same installation/removal requests with combinations of different Debian releases, or baselines ('sarge', 'etch', 'lenny', 'squeeze' and 'sid'). The initial 'status' for these experiments was the set of packages installed on a Debian server running 'sarge'.
The package managers selected for these tests were apt-get (v.0.9.7), aptitude (v.0.6.7), smart (v.1.4), and cupt (v.2.5.6). The first two are the most representative, as they are the standard package managers on Debian; the last two were selected because they were available in the Debian distribution at the time of writing and capable of working with Debian metadata. All package managers were used with their default options.
All these legacy solvers hardwire some optimization criterion, so it is not easy to compare them fairly; since the algorithms of these tools all work locally (they try to satisfy the user request by looking only at the dependencies directly related to the request), we decided to compare their solutions w.r.t. the 'paranoid' optimization function, which privileges conservative solutions.
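To make the ranking concrete, the following sketch computes a paranoid score for a candidate solution, assuming (per the MISC 2011 criteria) that "paranoid" lexicographically minimizes the number of removed packages, then the number of changed packages, relative to the initial status. The package representation and function name are illustrative, not MPM's actual code.

```python
def paranoid_score(initial, final):
    """Return (removed, changed): lower tuples are better solutions.

    initial, final: sets of (name, version) pairs describing the
    installed packages before and after applying a solution.
    """
    initial_names = {name for name, _ in initial}
    final_names = {name for name, _ in final}
    removed = len(initial_names - final_names)   # packages no longer present
    changed = len(initial ^ final)               # any install/remove/upgrade
    return (removed, changed)

# Python compares tuples lexicographically, which matches the
# criterion's ordering: fewer removals first, then fewer changes.
status = {("foo", "1.0"), ("bar", "2.0")}
sol_a = {("foo", "1.0"), ("bar", "2.1")}   # one upgrade, nothing removed
sol_b = {("foo", "1.0")}                   # one package removed
assert paranoid_score(status, sol_a) < paranoid_score(status, sol_b)
```

Under this scoring, a solution that upgrades a package beats one that removes it, which is exactly the conservative behavior the criterion is meant to privilege.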
A problem instance for our tests is composed of three parts: an installation status (a set of packages already installed on the machine that the user wants to modify), a universe of packages available for installation, and a user request (which packages to install or remove).
In all our problem instances, the installation status is the same: the image of a standard server installation running Debian sarge. We used five different universes of packages, obtained by progressively adding together, in chronological order, all the packages available in the last five Debian baselines: 'sarge', 'etch', 'lenny', 'squeeze' and 'sid'. The smallest universe, with only 'sarge', contains 15,000 packages, while the largest universe, the union of all baselines, adds up to 60,000 packages.
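The construction of the five universes can be sketched as a cumulative union over the baselines in chronological order. The baseline contents below are placeholders standing in for the real package sets:

```python
from itertools import accumulate

# Placeholder package sets; the real ones hold the (name, version)
# pairs of each Debian baseline.
baselines = [
    ("sarge", {("a", "1"), ("b", "1")}),
    ("etch",  {("a", "2"), ("c", "1")}),
    # ... 'lenny', 'squeeze' and 'sid' would follow in the same way
]

# universes[0] is 'sarge' alone; universes[-1] is the union of all
# baselines seen so far.
universes = list(accumulate((pkgs for _, pkgs in baselines), set.union))
assert universes[1] == baselines[0][1] | baselines[1][1]
```

Each universe is a superset of the previous one, so a solver's hit rate can only be affected by the extra (and potentially conflicting) versions that later baselines introduce.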
For each of these five universes, we ran 162 user requests, half of them requiring the installation of 5 packages and half requiring the removal of 5 packages. Package managers were given a timeout of 60 seconds per request. Increasing the timeout to 300s did not change the results.
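A test harness along these lines can enforce the per-request timeout and fold timeouts and crashes into the failure category; the command lines are illustrative, since actual invocations depend on each tool:

```python
import subprocess

TIMEOUT = 60  # seconds; raising this to 300 did not change the results

def run_request(cmd):
    """Run one solver invocation; a timeout or non-zero exit counts
    as a failure, matching how the failure category aggregates them."""
    try:
        proc = subprocess.run(cmd, capture_output=True, timeout=TIMEOUT)
    except subprocess.TimeoutExpired:
        return "failure"
    return "ok" if proc.returncode == 0 else "failure"
```

For example, `run_request(["apt-get", "-s", "install", "foo"])` would simulate one installation request under the timeout.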
Since performance was not a goal of this study, our simulation environment was established in a clean Debian chroot on a commodity x86 machine.
MPM is still a prototype and is not yet packaged; it is available from the Mancoosi SVN server.
The execution environment for our tests can be downloaded here.
The sources.list entries used for the above tests are included in the execution environment.