pages tagged sysadmin http://meng6net.localhost/tag/sysadmin/ <p><small>Copyright © 2005-2020 by <code>Meng Lu &lt;lumeng3@gmail.com&gt;</code></small></p> Meng Lu's home page ikiwiki Sun, 25 Oct 2020 00:50:16 +0000

Warning about low disk space http://meng6net.localhost/computing/sysadmin/automatically_warning_about_low_disk_space/ http://meng6net.localhost/computing/sysadmin/automatically_warning_about_low_disk_space/ computing note software sysadmin Sun, 25 Oct 2020 00:50:16 +0000 2020-10-25T00:50:16Z

<p>Even if my MacBook Pro boots with more than 10 GB of free disk space, after a day or two of continuous use the free space often drops dramatically, sometimes to nominally only 200 MB or so. This frequently causes problems:</p>
<ul>
<li>
<p>Having to kill unresponsive applications manually using Activity Monitor.</p>
</li>
<li>
<p>Some applications failing to exit cleanly and save their state and metadata, for example Eclipse workspace data, backup files in Emacs, and currently open tabs in Google Chrome.</p>
</li>
</ul>
<p>I made a <a href="https://github.com/lumeng/repogit-mengapps/blob/master/file_system/warn-about-low-disk-space.sh">Bash script</a> to check for and warn about low disk space, and run it periodically as a cron job.</p>
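<p>The script itself is in the linked repository; the following is only a minimal sketch of the idea, assuming a 10 GB threshold and a macOS <code>osascript</code> notification (the threshold, file name, and notification mechanism are illustrative assumptions, not necessarily what the actual script does):</p>
<pre><code>#!/usr/bin/env bash
# warn-about-low-disk-space.sh (sketch): warn when free space on / drops below a threshold.

THRESHOLD_KB=$((10 * 1024 * 1024))   # 10 GB, expressed in 1K blocks

# `df -k /` reports the root volume; the 4th column of the data row is the free 1K blocks.
free_kb=$(df -k / | awk 'NR==2 {print $4}')

if [ "$free_kb" -lt "$THRESHOLD_KB" ]; then
    msg="Low disk space: $((free_kb / 1024)) MB free on /"
    if command -v osascript >/dev/null; then
        # On macOS, pop up a desktop notification.
        osascript -e "display notification \"$msg\" with title \"Disk space warning\""
    else
        # Elsewhere, just print the warning (cron will mail it to the user).
        echo "$msg"
    fi
fi
</code></pre>
<p>Run periodically from cron, e.g. a crontab entry such as <code>*/30 * * * * /path/to/warn-about-low-disk-space.sh</code> would check every 30 minutes.</p>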
Classification of Ancient Chinese Books (中国古代书籍分类) http://meng6net.localhost/journal/%E4%B8%AD%E5%9B%BD%E5%8F%A4%E4%BB%A3%E4%B9%A6%E7%B1%8D%E5%88%86%E7%B1%BB/ http://meng6net.localhost/journal/%E4%B8%AD%E5%9B%BD%E5%8F%A4%E4%BB%A3%E4%B9%A6%E7%B1%8D%E5%88%86%E7%B1%BB/ TODO academics business information taxonomy journal library library science reference scholar sysadmin Fri, 08 May 2020 20:51:23 +0000 2020-05-08T20:51:23Z

<h2>Classification of Ancient Chinese Books (中国古代书籍分类)</h2>
<ul>
<li>
<p>Ancient Chinese books can be defined as works published or printed before 1912, together with later reproductions of the same works, such as post-1912 facsimile reprints and electronic editions.</p>
</li>
<li>
<p>Classics division (经部)</p>
<ul>
<li>Changes (易), Documents (书), Songs (诗), the Three Rites and ritual institutions (三礼礼制), Music (乐), Spring and Autumn Annals (春秋), the Four Books (四书), Classic of Filial Piety (孝经), philology (小学), general exegesis of the classics (群经总义), apocrypha (谶纬), collected compilations (丛编), stone classics (石经), etc.</li>
</ul>
</li>
<li>History division (史部)
<ul>
<li>Standard and alternative histories (正史別史), chronicles (编年), topical narratives (纪事本末), miscellaneous histories (杂史), records of regional regimes and geography (载记地理), historical tables (史表), historical criticism and excerpts (史评史抄), biographies (传记), governmental compendia and offices (政书职官), edicts and memorials (诏令奏议), epigraphy (金石), catalogues (目录), collected compilations (丛编), etc.</li>
</ul>
</li>
<li>Masters division (子部)
<ul>
<li>Confucians (儒家), Daoists (道家), military strategists (兵家), Legalists (法家), agriculturists (农家), medicine (医家), eclectics (杂家), minor narratives (小说家), astronomy and calendrical mathematics (天文历算), divination and numerology (术数), arts and crafts (艺术工艺), collected compilations (丛编), etc.</li>
</ul>
</li>
<li>Belles-lettres division (集部)
<ul>
<li>Songs of Chu (楚辞), individual collections (别集), anthologies (总集), criticism of poetry and prose (诗文评), ci lyrics (词), qu songs (曲), fiction (小说), collected compilations (丛编), etc.</li>
</ul>
</li>
<li>Collectanea division (丛部)
<ul>
<li>General encyclopedias (类书通类), specialized encyclopedias (类书专类), miscellaneous collectanea (杂纂丛书), collectanea of recovered lost texts (辑佚丛书), regional collectanea (郡邑丛书), clan collectanea (氏族丛书), single-author collectanea (独撰丛书), supplements to collectanea (丛书补缺), etc.</li>
</ul>
</li>
</ul>
<h3>TODO</h3>
<ul>
<li>Add examples of books in each category.</li>
</ul>

Back up MediaWiki http://meng6net.localhost/computing/sysadmin/backup_mediawiki/ http://meng6net.localhost/computing/sysadmin/backup_mediawiki/ backup mediawiki note sysadmin Tue, 16 May 2017 23:59:39 +0000 2017-05-16T23:59:39Z

<p>Here is a memo on backing up a MediaWiki instance, say one deployed as part of a website <code>mywebsite.com</code>.</p>
<p>The concrete steps are listed below.</p>
<p>First, change into the backup root directory on the local file system:</p>
<pre><code>cd /Volumes/BACKUP/mywebsite.com
</code></pre>
<h2>Back up using the <code>backup_mediawiki.sh</code> script</h2>
<ol>
<li>Log in to the web server.</li>
<li>Update the VCS repository <code>https://github.com/lumeng/MediaWiki_Backup</code>.</li>
<li>Back up using <code>MediaWiki_Backup/backup_mediawiki.sh</code>:
<pre>
# assuming the wiki's web directory is ~/mywebsite.com/wiki
WIKI_PATH="mywebsite.com/wiki"

# assuming the backup subdirectory backup_YYYYMMDD created by the script should go under path/to/backup/mywebsite.com/wiki
WIKI_BACKUP_PATH="path/to/backup/mywebsite.com/wiki"

# go to the home directory before starting
cd

# Start the backup. This creates path/to/backup/mywebsite.com/wiki/backup_YYYYMMDD.
path/to/backup_mediawiki.sh -d $WIKI_BACKUP_PATH -w $WIKI_PATH
</pre></li>
<li>Rsync the backup to a local hard drive:
<pre>
cd /Volumes/BACKUP/mywebsite.com

# Back up the whole website user's home directory, which includes the backup files created above, using rsync
rsync --exclude-from rsync_backup_exclusion.txt -thrivpbl user@webhost.com:/home/websiteuser rsync_backup/
</pre></li>
<li>Ideally, also upload the backup to cloud storage such as Dropbox.</li>
</ol>
<h2>HTML backup using <code>wget</code> for immediate reading</h2>
<p>Optionally, one can also keep a crawled copy of the MediaWiki instance. A set of plain HTML files can be useful for immediate offline reading.</p>
<pre><code>cd /Volumes/BACKUP/mywebsite.com/wget_backup
mkdir mywebsite.com-wiki__wget_backup_YYYYMMDD
cd mywebsite.com-wiki__wget_backup_YYYYMMDD

# crawl the whole website
# wget -k -p -r -E http://www.mywebsite.com/

# crawl the pages of the MediaWiki instance, excluding the Help and Special pages
wget -k -p -r --user-agent='Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2049.0 Safari/537.36' -R '*Special*' -R '*Help*' -E http://www.mywebsite.com/wiki/

cd ..
7z a -mx=9 mywebsite.com-wiki__wget_backup_YYYYMMDD.7z mywebsite.com-wiki__wget_backup_YYYYMMDD
</code></pre>
<p>Remarks:</p>
<ul>
<li><code>-k</code>: convert links to suit local viewing</li>
<li><code>-p</code>: download page requisites/dependencies</li>
<li><code>-r</code>: download recursively</li>
<li><code>--user-agent</code>: set a "fake" user agent to emulate regular browsing, since some sites check the user agent. User-agent strings can be looked up at <a href="http://www.useragentstring.com/pages/Chrome/">useragentstring.com</a>.</li>
</ul>
<p>As for the time needed to create the wget-crawled backup, in one experiment it took about 30 minutes to download a small MediaWiki installation with a few hundred user-created pages.</p>
<p>If only a small set of pages needs to be backed up, <code>curl</code> can be used instead, for example:</p>
<pre><code># download multiple pages
curl -O http://mywebsite.com/wiki/Foo_Bar[01-10]
</code></pre>
<h2>References</h2>
<ul>
<li>https://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki</li>
<li>https://www.mediawiki.org/wiki/Fullsitebackup</li>
<li>https://www.mediawiki.org/wiki/Manual:DumpBackup.php</li>
<li>https://wikitech.wikimedia.org/wiki/Category:Dumps</li>
</ul>

Back up MediaWiki http://meng6net.localhost/blog/backup_mediawiki/ http://meng6net.localhost/blog/backup_mediawiki/ blog computing mediawiki sysadmin Tue, 16 May 2017 23:59:39 +0000 2017-05-16T23:59:39Z <p>Wrote a <a href="http://meng6.net/pages/computing/sysadmin/backup_mediawiki/">note</a> about backing up MediaWiki, including its database (MySQL or SQLite), content pages exported as an XML file, all images, and the entire directory where MediaWiki is installed, which includes <code>LocalSettings.php</code> and extensions that usually contain customization.</p> /blog/backup_mediawiki/#comments
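<p>The <code>backup_mediawiki.sh</code> script automates these pieces; for reference, a rough manual equivalent of the database and XML parts might look like the sketch below. The database name, user, and install path are illustrative assumptions, not values from the note.</p>
<pre><code># Assumed names: database wikidb, user wikiuser, MediaWiki installed at ~/mywebsite.com/wiki.

# 1. Dump the MySQL database (for SQLite, copying the database file from
#    $wgSQLiteDataDir while the wiki is in read-only mode is the usual approach).
mysqldump --user=wikiuser --password --default-character-set=binary wikidb > wikidb_$(date +%Y%m%d).sql

# 2. Export all content pages, with full history, as XML via MediaWiki's maintenance script.
php ~/mywebsite.com/wiki/maintenance/dumpBackup.php --full > wiki_pages_$(date +%Y%m%d).xml

# 3. Archive the uploaded images and the installation directory (LocalSettings.php, extensions, etc.).
tar -czf wiki_files_$(date +%Y%m%d).tar.gz -C ~/mywebsite.com wiki
</code></pre>
<p>Restoring would go the other way: load the SQL dump back into the database and, if needed, re-import the XML with MediaWiki's <code>importDump.php</code>.</p>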