DN42 is a wonderful project that lets you develop your BGP skills without touching a production environment and without needing expensive devices for a lab or simulations in GNS3. At the same time it is not a sterile laboratory environment free of real-world problems. I have participated in the project with one node for about a year. One of its problems maps 1:1 to the real world: people announcing prefixes they should not announce. Because I am lazy and do not feel like typing filters by hand every time, I solved the problem with a simple bash script that generates a prefix list named dn42 and fills it with the valid prefixes.

#!/bin/bash
vtysh -c 'conf t' -c "no ip prefix-list dn42"; #drop old prefix list

while read pl
do
vtysh -c 'conf t' -c "$pl"; #insert prefix list row by row
done < <(curl -s https://ca.dn42.us/reg/filter.txt | grep -e ^[0-9] | awk '{ print "ip prefix-list dn42 seq " $1 " " $2 " " $3 " ge " $4 " le " $5}' | sed "s_/\([0-9]\+\) ge \1_/\1_g;s_/\([0-9]\+\) le \1_/\1_g");
vtysh -c 'wr' #write new prefix list

The list of valid prefixes comes from https://ca.dn42.us/reg/filter.txt, maintained upstream, plus a few small modifications on my part so that a prefix list can be generated from it. The commands are executed in vtysh.
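The awk/sed transformation can be sketched offline on a couple of sample lines in the filter.txt format (seq, action, prefix, min length, max length – the sample values below are made up for illustration). The sed step collapses the redundant ge/le clauses when the bound equals the prefix length:

```shell
# Hypothetical sample lines in the filter.txt format: seq action prefix min-len max-len
printf '%s\n' \
  '10 permit 172.20.0.0/14 21 29' \
  '30 permit 172.22.0.53/32 32 32' |
awk '{ print "ip prefix-list dn42 seq " $1 " " $2 " " $3 " ge " $4 " le " $5 }' |
sed 's_/\([0-9]\+\) ge \1_/\1_g;s_/\([0-9]\+\) le \1_/\1_g'
# → ip prefix-list dn42 seq 10 permit 172.20.0.0/14 ge 21 le 29
# → ip prefix-list dn42 seq 30 permit 172.22.0.53/32
```

Note how the /32 entry loses both its `ge 32` and `le 32` clauses, since an exact-length match needs neither.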

Email client

The idea is identical to the one in my post VACUUM Firefox databases and REINDEX. Some time ago Debian dropped the rebranded versions of Mozilla's products, and during the migration from Icedove to Thunderbird it occurred to me that I had never defragmented the databases; over time a serious amount of mail, email accounts and servers, users and passwords has passed through the mail client. The script is identical to the one from my previous post, with only a slight modification of where to look for the files 🙂

Linux version

for db in $(find ~/.thunderbird/$(grep Path ~/.thunderbird/profiles.ini | cut -d'=' -f2) -maxdepth 1 -name "*.sqlite" -type f)
do
echo "VACUUM ${db}"
sqlite3 "${db}" VACUUM
sqlite3 "${db}" REINDEX
done

Mac OS version

for db in $(find ~/Library/Thunderbird/$(grep Path ~/Library/Thunderbird/profiles.ini | cut -d'=' -f2)  -maxdepth 1  -name "*.sqlite" -type f)
do
echo "VACUUM && REINDEX ${db}"
sqlite3 "${db}" VACUUM;
sqlite3 "${db}" REINDEX;
done

Unlike Firefox's, the Thunderbird profile folder lives at a saner path (without a space in it), so there is no need to change the delimiter.

A cunning trick to improve the speed of a browser like Firefox is to vacuum its sqlite databases. The VACUUM process rebuilds the database and thereby defragments it: the file shrinks, searches get faster, disk usage becomes more efficient, and the load on your disk may drop because reads become sequential. In general this applies to any database that does not perform automatic vacuuming/defragmentation on its own.
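The space reclaimed by VACUUM is easy to see on a throwaway database. This sketch uses the plain sqlite3 CLI and nothing Firefox-specific: it fills a table, deletes every row – which leaves the file at full size, since sqlite keeps the freed pages – and then vacuums:

```shell
db=$(mktemp)

# fill a table with ~1 MB of blobs, then delete every row;
# sqlite keeps the freed pages, so the file stays big
sqlite3 "$db" 'CREATE TABLE t(x);
WITH RECURSIVE c(i) AS (SELECT 1 UNION ALL SELECT i+1 FROM c WHERE i<1000)
INSERT INTO t SELECT randomblob(1024) FROM c;
DELETE FROM t;'

before=$(wc -c < "$db")
sqlite3 "$db" 'VACUUM;'     # rebuild the file, dropping the free pages
after=$(wc -c < "$db")

echo "before: $before bytes, after: $after bytes"
rm -f "$db"
```

On my understanding of sqlite's page allocation the "after" size should come out dramatically smaller than the "before" size, which is exactly the effect the Firefox script exploits.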

The process of defragmenting Firefox's databases is pretty trivial – we run sqlite3 db-file VACUUM in the Firefox profile directory. On Linux the path is usually ~/.mozilla/firefox/random-name.default. In general you should have only one directory under ~/.mozilla/firefox; if you have more, check ~/.mozilla/firefox/profiles.ini to see which directory holds your browser's profile. Since I have no intention of performing the process by hand, I wrote a basic script which defragments the databases:

for db in $(find ~/.mozilla/firefox/$(grep Path ~/.mozilla/firefox/profiles.ini | cut -d'=' -f2) -maxdepth 1 -name "*.sqlite" -type f)
do
   echo "VACUUM ${db}"
   sqlite3 "${db}" VACUUM
   sqlite3 "${db}" REINDEX
done

After the script runs, searching the history and the autocomplete in Firefox are noticeably faster. With small modifications, I think, the script can also be used on Mac OS.

P.S. Here's the version for Mac OS X – tested on macOS Sierra. The delimiter has to be changed to a newline first, because of the space in the path to the folder which contains the Firefox profile:

OIFS="$IFS"
IFS=$'\n'
for db in $(find ~/Library/Application\ Support/Firefox/$(grep Path ~/Library/Application\ Support/Firefox/profiles.ini | cut -d'=' -f2)  -maxdepth 1  -name "*.sqlite" -type f)
do
echo "VACUUM && REINDEX ${db}"
sqlite3 "${db}" VACUUM;
sqlite3 "${db}" REINDEX;
done
IFS="$OIFS"
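Why the IFS change matters can be demonstrated with a throwaway directory whose name contains a space (the paths here are temporary ones created just for the demo): with the default IFS the unquoted $(find …) result is split on the space into two words, while with IFS set to a newline each path survives as a single word.

```shell
dir=$(mktemp -d)
mkdir "$dir/Application Support"
touch "$dir/Application Support/places.sqlite"

# default IFS: the single path is split on the space into two words
for f in $(find "$dir" -name '*.sqlite'); do echo "word: $f"; done

# newline-only IFS: the path comes through intact
OIFS="$IFS"
IFS=$'\n'
for f in $(find "$dir" -name '*.sqlite'); do echo "line: $f"; done
IFS="$OIFS"

rm -rf "$dir"
```

The first loop prints two mangled "word:" fragments; the second prints exactly one "line:" with the full path.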

The certification at ipv6.he.net includes daily tests that give 1 extra point each, once you have passed the basic tests. You have to do 100 such tests for the maximum score 😐 . The tests themselves are completely trivial:

  • Traceroute
  • DIG AAAA
  • DIG PTR
  • Ping
  • Whois

The most unpleasant part is that the tests themselves must be unique, so you cannot use a single domain every day 🙂 and, on top of that, it is a bit annoying 🙄 – every day, the same 5 commands in the CLI plus copy/paste into their web site.

As a lazy admin who likes to make his life easier, I quickly hacked together a small script to do the dirty work for me:

#!/bin/bash

hr() {
  local start=$'\e(0' end=$'\e(B' line='qqqqqqqqqqqqqqqq'
  local cols=${COLUMNS:-$(tput cols)}
  while ((${#line} < cols)); do line+="$line"; done
  printf '%s%s%s\n' "$start" "${line:0:cols}" "$end"
}

if [ -z "$1" ]
then
  echo "Append domain after the script name!!!"
  exit 1
fi

IP=$(dig "$1" AAAA +short)

if [ -z "${IP}" ]
then
  echo "$1 doesn't have a valid IPv6 record"
else
  reset
  traceroute6 "$1"
  hr
  dig "$1" AAAA
  hr
  dig -x "${IP}"
  hr
  ping6 -c3 "${IP}"
  hr
  whois "${IP}"
fi

As you can see, the script is insanely elementary. You feed it a domain name, it checks whether the domain has IPv6 records and, if so, runs the daily tests against it. The neatest part is the hr function, which prints a line across the full width of the screen, taken from bash-hackers.

One of the things that annoys me most is copying/moving a large directory in the CLI without knowing what percentage of the total size has been transferred. Unfortunately cp/mv have no such capability, so you have to resort to alternatives. There are quite a few options, but personally I like best to use rsync instead of cp/mv. It has everything built in – preservation of the permissions of files and directories, a progress bar, and even the ability to delete the source files after copying.

In general, I made myself two aliases which do a more than marvelous job:

alias cpi='rsync -a --info=progress2'
alias mvi='rsync -a --info=progress2 --remove-source-files'
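A quick sanity check of what the two aliases do, using temporary directories made up for the demo: cpi copies while preserving attributes, mvi additionally removes the source files. The shopt line is needed because aliases are off by default in non-interactive bash:

```shell
shopt -s expand_aliases   # aliases don't expand in scripts without this
alias cpi='rsync -a --info=progress2'
alias mvi='rsync -a --info=progress2 --remove-source-files'

src=$(mktemp -d); dst=$(mktemp -d)
echo 'hello' > "$src/file.txt"

cpi "$src/" "$dst/"       # copy with a single overall progress bar
cat "$dst/file.txt"       # → hello

mvi "$src/" "$dst/"       # copy again, then delete the source files
[ -f "$src/file.txt" ] || echo 'source file removed'

rm -rf "$src" "$dst"
```

One caveat worth knowing: --remove-source-files deletes only the files, so the now-empty source directories stay behind and have to be removed separately.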