Mediawiki-File mass deletion

From Biowikifarm Metawiki
Revision as of 23:49, 18 June 2012 by Gregor Hagedorn (Talk | contribs)

Deleting a large number of files from a MediaWiki-based media repository can be cumbersome.

One convenient method is to use the "Nuke" mass deletion extension (also installed on WMF/Wikipedia sites). This extension can delete by user name or IP. If the files to delete are older than the current recent-changes retention window (on the biowikifarm usually 90 days), it is possible to simply raise that setting to an arbitrarily higher value (300 days, 600 days, ...) and then run rebuildrecentchanges.php.
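As a sketch of the step above (assuming shell access to the wiki root; the path and day count are examples, and the retention window is controlled by the standard MediaWiki setting $wgRCMaxAge, which is given in seconds):

```shell
# In LocalSettings.php (PHP), widen the recent-changes window, e.g. to 300 days:
#   $wgRCMaxAge = 300 * 24 * 3600;  # value is in seconds; default here was 90 days
# Then repopulate the recentchanges table from the wiki root directory:
cd /var/www/v-species/o
php ./maintenance/rebuildrecentchanges.php --conf ./LocalSettings.php
```

Afterwards Special:Nuke can see deletable files over the widened period; the setting can be reduced again once the cleanup is done.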

However, the Nuke method has some limitations. For a large number of images it shows two bugs: a) fewer files are actually deleted than listed (perhaps 5000 selected, but only 500-1000 deleted?), and b) running Nuke repeatedly always relists the previously deleted files first, so repeated runs are no solution.

Combining a query method (e.g. Mass Deletion/Nuke) with deleteBatch.php

Our recent solution was to use the Mass deletion special page (= Nuke extension) to create a list of files to be deleted (raising the limit of files to be listed to 10000), copy that list into a text editor, clean it up and reduce each line to just the "File:Name.jpg" part, then put that file on the server and run deleteBatch.php as in:
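The text-editor cleanup can often be scripted instead. A minimal sketch, assuming the copied listing was saved as nuke_listing.txt (a hypothetical name) with each line starting with the file title followed by annotations in parentheses; titles containing spaces would still need manual attention:

```shell
# Keep only the "File:Name.jpg" part of each line; the pattern stops at the
# first space or opening parenthesis, so annotations after the title are dropped.
grep -o 'File:[^ (]*' nuke_listing.txt > aaa_batchdelete.txt
```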

cd /var/www/v-species/o
php ./maintenance/deleteBatch.php --conf ./LocalSettings.php -r "preparing for reupload" ./aaa_batchdelete.txt