Categories
Internet

Choosing a rural broadband provider

I have recently moved home and I decided that after nearly eight years with PIPEX it was time to look around and see if it was possible to find a better rural broadband provider.

PIPEX provided the service I needed for many years, but after multiple takeovers the service and the customer support deteriorated. The low point came in the summer of 2009, when my broadband connection went up from 1.5Mbps to 3.5Mbps for six weeks. To some this would be fantastic news, and for a while it was to me too. I told my neighbour about the great broadband service I was getting from PIPEX. He was still on 1.5Mbps, so he complained to his provider (the one with the local DSLAM). Within a week of his complaint my broadband speed went down to 1Mbps and was up and down like a yo-yo thereafter. I was still paying nearly £30 a month for my unlimited-use contract, so I complained to PIPEX. They dropped my monthly bill to less than £10 a month, but my broadband performance didn't improve and actually got as low as just 512Kbps in 2010.

I realised that since I first signed with PIPEX, many BT exchanges in the UK have been equipped with competitors' broadband equipment. My rural exchange was not one of them, and my broadband was resold to PIPEX through BT Wholesale. PIPEX (now Opal) simply didn't have the clout in my region to demand better service from BT Wholesale, so they took my money and blamed my problems on my rural location. No one could explain why I enjoyed six weeks of uninterrupted broadband at 3.5Mbps in the same house with the same equipment, so I knew I was being screwed.

I did some research online and found that PlusNet had a pretty good reputation amongst rural customers. They also have some good online tools for taking a peek at the faults logged on their network, including BT Wholesale's provision. I looked at taking broadband from Sky, BT and a few others, but because I live in a rural location my connection would, for the time being, be provided by BT either directly or via wholesale, and none of the TV-advertised deals apply. I also wouldn't get the same level of techie fault-reporting tools offered by PlusNet. It was clear to me that PlusNet were worth taking a chance on, so I signed up for PlusNet voice and broadband for a year, and so far I am very happy with the change.

The only hiccup I have had with PlusNet is that they assume all new customers already have uninterrupted internet access or mobile phone access. I had neither at my new home, and I still don't have any mobile phone reception. All of PlusNet's communication with new customers waiting for connection is via email and SMS text message. Fortunately, my wife's corporate BlackBerry could receive email here, so the important messages concerning installation and activation went to her email account instead. PlusNet will not send you a letter even if you request it, so they are by no means perfect, but the best I could hope for.

Categories
Ubuntu

Wireshark missing interfaces on Ubuntu 10.04

I have been using Wireshark for some time on my Vista laptop but I couldn’t get it to work on my smaller Ubuntu laptop. When an update was installed today for Wireshark on my Windows machine I decided that now was the time to fix the problem on my old Ubuntu laptop.

It was a simple fix, documented in the wiki. All I had to do was run this command to have Wireshark detect the interfaces.

sudo setcap 'CAP_NET_RAW+eip CAP_NET_ADMIN+eip' /usr/bin/dumpcap
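If the interfaces still don't appear, it is worth confirming that the capabilities were actually applied. This is a quick sanity check, assuming the same dumpcap path as above:

```shell
# Verify the file capabilities granted to dumpcap
# (path assumed from the setcap command above)
getcap /usr/bin/dumpcap
```

getcap should echo the path followed by the granted capabilities; after that, Wireshark can be started as an ordinary user and should list the interfaces.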

Categories
Linux Ubuntu

Compiling cURL with SSL on Ubuntu 10.04

I was having problems compiling the latest source of cURL on a freshly installed Ubuntu 10.04 host. The ./configure for cURL refused to find OpenSSL despite it being installed. After reading the cURL FAQ, I checked whether libssl was present on my machine and found that it wasn't. The library is not part of the main package but comes with the development package. I installed it using:-

sudo apt-get install libssl-dev

Then I re-ran the ./configure for cURL with the SSL option

./configure --with-ssl

Success was confirmed by the output on screen.

curl version:    7.21.0
Host setup:      i686-pc-linux-gnu
Install prefix:  /usr/local
Compiler:        gcc
SSL support:     enabled (OpenSSL)
SSH support:     no      (--with-libssh2)
zlib support:    enabled
...
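For completeness, this is roughly how the build would be finished and SSL support confirmed — a sketch, assuming the default /usr/local prefix shown in the configure output:

```shell
# Build and install to the configured prefix
make
sudo make install
# The "Features:" line of the version output should now include SSL
curl --version
```

If an older distro-packaged curl is still first in your PATH, check with `which curl` (or run `hash -r` in bash) to be sure you are testing the newly installed binary.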
Categories
FreeSwitch

FreeSWITCH Book

There is a new book on FreeSWITCH 1.0.6 due to be published in July 2010 by Packt Publishing. I'm looking forward to reading it.

To find out more about FreeSWITCH visit www.freeswitch.org

Categories
Hardware

Acer DX900 Smartphone

I bought an Acer DX900 Smartphone last September without actually trying one out first. That was a big mistake. The phone's specification is impressive, but it's the dual-SIM capability that I really needed. I had a look at some Samsung Duos models and they were very good, but none of them at that time had WLAN. It wasn't long before I started to discover how less than half-baked the DX900 is, and how disappointing Acer's support can be. The Product Manager for this model is non-existent.

After more than seven months of use, I am now ready to accept that I was exceptionally stupid to buy this phone without trying it first. I am stuck with it for at least another year. Having to pull the battery out every day to restart the phone after it crashes is becoming very tiresome.

Note to self:

(1) Never knowingly buy anything made by Acer.
(2) Never buy a phone that has a Microsoft Operating System.

Categories
Internet

Secure DNS

I use various Comodo tools to protect my Windows-based computers. One service offering I noticed recently was their Secure DNS, which provides an alternative to the DNS provided by my ISP. Making the change is straightforward in DHCP or resolver configuration. If you need instructions, they can be found here.

The IP addresses for Comodo’s Secure DNS are:-

156.154.70.22
156.154.71.22

Other secure DNS providers include

Google Public DNS

8.8.8.8
8.8.4.4

OpenDNS

208.67.222.222
208.67.220.220
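Before changing your DHCP or resolver settings, it can be worth querying a resolver directly to confirm it answers from your connection. A quick check using dig (nslookup works similarly on Windows):

```shell
# Ask Google Public DNS and Comodo Secure DNS directly for a test name;
# each should return the same address records
dig @8.8.8.8 example.com +short
dig @156.154.70.22 example.com +short
```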

Categories
Linux MythTV

MythWeb in the DMZ

These instructions have been written specifically for installing MythWeb on an Ubuntu 9.10 host.

Preparation

Build an Apache2 web host in the DMZ and setup password login using .htaccess in the web server’s document root.

Use individual user IDs and a group called 'authorised-users' to control access to the server. See htpasswd.

Configure port forwarding on your firewall to forward port 8090 aimed at the public interface to port 80 on the DMZ web server’s interface. To access the web page, point the browser at http://mythweb.dyndns.local:8090/

Test that the security works from a friend’s computer with internet access.
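As a sketch of the preparation above — the file locations and user name here are examples, not a prescription — the password file and .htaccess might look like this:

```shell
# Create the password file outside the document root and add the first user
sudo htpasswd -c /etc/apache2/.htpasswd vince

# A minimal .htaccess in the web server's document root
cat <<'EOF' | sudo tee /var/www/.htaccess
AuthType Basic
AuthName "Restricted"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
EOF
```

To restrict access to the 'authorised-users' group rather than any valid user, add an AuthGroupFile directive and use `Require group authorised-users`. Apache must also have `AllowOverride AuthConfig` (or `All`) set for the directory for .htaccess to take effect.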

Installation

The default installation for MythWeb is directly on the MythTV host backend. There is no easy installation option for installing MythWeb on another host. However, it is possible to checkout MythWeb individually from SVN and install manually which is the approach I am taking.

Install Subversion if not already installed.

sudo apt-get install subversion

From the web document root, checkout MythWeb from SVN

cd /var/www
sudo svn co http://svn.mythtv.org/svn/branches/release-0-22-fixes/mythplugins/mythweb

This will create a subdirectory /var/www/mythweb containing the MythWeb software.

File System Permissions

Determine the user currently running Apache as this information will be required to set access to the MythWeb data.

ps aux | grep -i apache | awk '{ print $1 }'

This should display a list of the user IDs running Apache.


root
www-data
www-data
www-data
www-data
www-data
www-data
www-data
www-data
www-data
vince

The most frequently occurring ID is the one to use. So, www-data is the user running Apache on my system.

sudo chgrp -R www-data /var/www/mythweb/data
sudo chmod g+rw /var/www/mythweb/data

Create a subdirectory to hold TV channel icons instead of storing them in users' home directories.

sudo mkdir /var/www/mythweb/data/tv_icons
sudo chown www-data:www-data /var/www/mythweb/data/tv_icons

Required Apache Modules

Ensure the required Apache modules are installed by executing the following:-

sudo a2enmod rewrite
sudo a2enmod deflate
sudo a2enmod headers
sudo a2enmod auth_digest
sudo /etc/init.d/apache2 restart

Configuring Apache for MythWeb

Copy the sample Apache configuration file to the additional configuration directory ‘sites-available’.

sudo cp /var/www/mythweb/mythweb.conf.apache /etc/apache2/sites-available/mythweb.conf

Edit the file using your favourite text editor and make the following changes.


# If you intend to use authentication for MythWeb (see below), you will
# probably also want to uncomment the following rules, which disable
# authentication for MythWeb's download URLs so you can properly stream
# to media players that don't work with authenticated servers.
#
<LocationMatch .*/pl/stream/[0-9]+/[0-9]+>
Allow from all
</LocationMatch>
#
<LocationMatch .*/music/stream.php>
Allow from all
</LocationMatch>

Change the paths for the MythWeb directories in the following section:-

#
# CHANGE THESE PATHS TO MATCH YOUR MYTHWEB INSTALLATION DIRECTORY!  e.g.
#
#    /var/www
#    /home/www/htdocs
#    /var/www/html/mythweb
#    /srv/www/htdocs/mythweb
#
<Directory "/var/www/mythweb/data">
Options -All +FollowSymLinks +IncludesNoExec
</Directory>
<Directory "/var/www/mythweb" >

Configure authentication using htdigest. (Still to do: check how this interacts with the .htaccess method used in the preparation stage, and update that stage accordingly.)

############################################################################
# I *strongly* urge you to turn on authentication for MythWeb.  It is disabled
# by default because it requires you to set up your own password file.  Please
# see the man page for htdigest and then configure the following four directives
# to suit your authentication needs.
#
AuthType           Digest
AuthName           "MythTV"
AuthUserFile       /var/www/htdigest
Require            valid-user
BrowserMatch       "MSIE"      AuthDigestEnableQueryStringHack=On
Order              allow,deny
Satisfy            any
#
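The AuthUserFile referenced above is created with htdigest; the realm argument must match the AuthName directive ("MythTV"). The user names below are just examples:

```shell
# Create the digest password file and add the first user
# (-c creates/overwrites the file)
sudo htdigest -c /var/www/htdigest MythTV vince
# Add further users WITHOUT -c, which would wipe the file
sudo htdigest /var/www/htdigest MythTV alice
```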

Change the value for db_server from ‘localhost’ to the hostname of the MythTV Backend with the MySQL database. Ensure that the MythWeb host can resolve the hostname that you use. Edit /etc/hosts to include a valid entry for the backend if it can’t.

#
# Use the following environment settings to tell MythWeb where you want it to
# look to connect to the database, the name of the database to connect to, and
# the authentication info to use to connect.  The defaults will usually work
# fine unless you've changed mythtv's mysql.txt file, or are running MythWeb on
# a different server from your main backend.  Make sure you have mod_env enabled.
#
setenv db_server        "pc204"
setenv db_name          "mythconverg"
setenv db_login         "mythtv"
setenv db_password      "mythtv"

Change the email address that receives error alerts to one that you currently use.

# If you want MythWeb to email php/database errors (and a backtrace) to you,
# uncomment and set the email address below.
#
#   setenv error_email       "alerts@vlara.co.uk"
#

Enable mod_deflate

# Enable mod_deflate.  This works MUCH more reliably than PHP's built-in
# gzip/Zlib compressors.  It is disabled here because many distros seem not
# to enable mod_deflate by default, but I strongly recommend that you
# enable this section.
#
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
#
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/x-javascript
#
# This is helpful for mod_deflate -- it prevents proxies from changing
# the user agent to/from this server, which can prevent compression from
# being enabled.  It is disabled here because many distros seem not to
# enable mod_headers by default, but I recommend that you enable it.
#
Header append Vary User-Agent env=!dont-vary

Activate the configuration changes by executing the following commands:-

sudo a2ensite mythweb.conf
sudo /etc/init.d/apache2 reload

Network Access To MySQL from the DMZ

The MythWeb host in the DMZ will not have direct access to MySQL on the MythTV backend. The firewall will be blocking communication from the DMZ to the inside network. You need to open up ‘pin holes’ in the firewall to permit access from MythWeb to MythTV on ports 3306, 6543 and 6544. I created rules for TCP and UDP until I can test which are required. I suspect only TCP is required.
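On an iptables-based firewall the pin holes might look something like this. This is a sketch only, with made-up addresses (DMZ web host 192.168.2.10, MythTV backend 192.168.1.20); your firewall's syntax and topology will differ:

```shell
# Allow MythWeb in the DMZ to reach MySQL (3306) and the MythTV
# backend/status ports (6543, 6544) on the inside network
iptables -A FORWARD -p tcp -s 192.168.2.10 -d 192.168.1.20 \
    -m multiport --dports 3306,6543,6544 -j ACCEPT
```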

MySQL on the MythTV backend also needs to be reconfigured to allow access from remote hosts. Edit the file /etc/mysql/my.cnf and change bind-address from 127.0.0.1 to the IP address of the MythTV host.
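Changing bind-address lets MySQL listen on the network, but the mythtv account must also be permitted to connect from the DMZ host. Something like the following, run on the backend — the DMZ IP is an example, and the credentials are the defaults shown in the configuration above:

```shell
# Grant the mythtv user access from the DMZ web host, then restart MySQL
mysql -u root -p -e "GRANT ALL ON mythconverg.* TO 'mythtv'@'192.168.2.10' IDENTIFIED BY 'mythtv'; FLUSH PRIVILEGES;"
sudo /etc/init.d/mysql restart
```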

Testing MythWeb

Playing Flash videos from 'Recorded Programs' results in the error 'Netstream not found'. This is most likely due to the firewall blocking traffic between the browser and the server. Fortunately, Adobe have a very handy web page that tests connection capability with their Flash Media Server, which can be used to help diagnose the problem.

Create a firewall rule to allow port 1935 (macromedia-fcs) Real Time Messaging Protocol (RTMP) between MythWeb and MythTV.
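Once the rule is in place, a quick way to confirm the port is reachable from the DMZ host (pc204 is the backend hostname used earlier) is netcat:

```shell
# -z: just probe the port without sending data, -v: report the result
nc -zv pc204 1935
```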

A work in progress…

Categories
Hardware Nagios

Monitoring a Linksys WAG200G using SNMP

I have been using a Linksys WAG200G as a wireless access point since December 2007. I’m not using it for my broadband connection as I have a separate firewall and router already on my network. It has been running reliably without any problems since installed and it occurred to me that it had been some time since I had used the device’s administration page or reviewed Cisco’s patch history for it.

Using the web interface, the installed firmware was shown to be version 1.0.9, which was some way behind the current 1.1.9 release. I couldn’t find the release notes for any versions prior to 1.1.5 so I decided to upgrade the firmware to be certain that any known vulnerabilities had been patched.

After exploring the device's web interface, I remembered that the little router supports SNMP. I didn't have an NMS when it was installed, so I had left this feature unconfigured. Now that I have a Nagios console, it was time to activate SNMP management. I set the device name to the same name that its IP address resolves to in my DNS (wap101). I then set the monitoring IP address and trap target address to that of my NMS. Finally, I set the read community to public and the write community to private.

From a command prompt on my NMS, I dumped a list of the management functions supported by the WAG200G using this command…

snmpwalk -v1 -c public 192.168.1.30 -m ALL .1

My Linksys uses 192.168.1.30 for its Ethernet interface. Change it to your device's IP address if you are going to try this yourself. Redirecting the output to a file is useful for future reference.
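For example, to keep the dump for later reference (the filename is arbitrary):

```shell
# Walk the whole MIB tree and save the result for later
snmpwalk -v1 -c public 192.168.1.30 -m ALL .1 > wag200g-snmpwalk.txt
```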

A sample output of snmpwalk looks like this

IF-MIB::ifInErrors.1 = Counter32: 0
IF-MIB::ifInErrors.2 = Counter32: 0
IF-MIB::ifInErrors.3 = Counter32: 0
IF-MIB::ifInErrors.4 = Counter32: 0
IF-MIB::ifInErrors.5 = Counter32: 0

My WAG200G is only used as a WLAN access point, so I apologise now for not covering ADSL monitoring, or anything other than the Ethernet and WLAN interfaces, in the host and service definition file for my WAG200G. If you want to monitor more, just pick the relevant items from the MIBs reported by snmpwalk and add them to your Nagios services. Think about the outputs and what conditions, if any, should trigger alerts. Most of mine only need to alert if the result rises above zero. This is the list of services I am interested in monitoring:-

  • PING
  • Uptime
  • eth0 IN Discarded Packets
  • eth0 IN Errors
  • eth0 IN Unknown Protocols
  • eth0 OUT Discarded Packets
  • eth0 OUT Errors
  • eth0 Operational Status
  • wlan0 IN Discarded Packets
  • wlan0 IN Errors
  • wlan0 IN Unknown Protocols
  • wlan0 OUT Discarded Packets
  • wlan0 OUT Errors
  • wlan0 Operational Status

I found that Nagios doesn’t like non-unique service descriptions, which is why my descriptions take the form shown above. Click here to view my Host and Services Definitions for the WAG200G.

The host definition inherits from the generic-switch template and looks like this…

# Define the switch that we'll be monitoring
define host{
use generic-switch ; Inherit default values from a template
host_name wap101 ; The name we're giving to this switch
alias Linksys WAG200G ; A longer name associated with the switch
address 192.168.1.30 ; IP address of the switch
hostgroups switches ; Host groups this switch is associated with
}

Each service inherits from the generic-service template and looks something like this…

# Monitor Port 4 (wlan0) number of errors in via SNMP
define service{
use generic-service ; Inherit values from a template
host_name wap101
service_description wlan0 IN Errors
check_command check_snmp!-C public -o ifInErrors.4 -c 0 -m IF-MIB
}

I used the documentation on check_snmp to prevent critical warnings for zero values (-c 0). In time, if any of my services start seeing errors I can change them to use a warning range and a critical range instead.
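It is worth running the check by hand before wiring it into Nagios. The plugin path below is the Ubuntu package default; adjust it if yours differs:

```shell
# Same arguments as the service definition above, run manually;
# exit status 0/OK means no critical condition was detected
/usr/lib/nagios/plugins/check_snmp -H 192.168.1.30 -C public \
    -o ifInErrors.4 -m IF-MIB -c 0
```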

My Ubuntu 9.10 package install of Nagios was missing the check_snmp command definition. I added the following to the bottom of my /etc/nagios-plugins/config/snmp.cfg to get the SNMP checks working.

define command{
command_name check_snmp
command_line $USER1$/check_snmp -H $HOSTADDRESS$ $ARG1$
}

Categories
Hardware Ubuntu

Upgrading the CPU on a Dell GX240

2.6GHz Celeron

My two recently acquired Dell GX240 PCs were surprisingly quick with their 1.6GHz Pentium 4 processors and Ubuntu. However, after some research I discovered that the GX240 motherboard can take a more powerful processor without a change to faster RAM. A quick search on eBay located two used SL6VV (2.6GHz Celeron) processors for £3.95 each (including postage!) and they were promptly purchased.

The upgrade itself is very easy. Simply open the case, flip up the green heat-sink shroud and unclip and remove the heat-sink. Release the socket ZIF lever and swap out the processor with the new one. Replace the heat-sink, clips and shroud, close the case and restart the PC. During the boot phase, press F2 to go into the BIOS setup. The main page will provide immediate confirmation that the Celeron has been recognised.

I bought a syringe of CPU heat-sink grease but I didn’t need it. The stock heat-sink had a thermally conductive sticky pad that stayed stuck to it instead of the processor. The pad was in good condition so I decided to reuse it to avoid trying to clean it off.

Using the CPU benchmark in BOINC, the results for the 1.6GHz Intel Pentium 4 were…

778 floating point MIPS (Whetstone)
1644 integer MIPS (Dhrystone)

After installing the 2.6GHz Intel Celeron the benchmark showed a substantial improvement…

1327 floating point MIPS (Whetstone) per CPU
3532 integer MIPS (Dhrystone) per CPU

Verdict

The performance of Ubuntu Desktop 9.10 running on a Dell GX240 with a 1.6GHz Intel Pentium 4 and 512MB RAM is surprisingly good. Upgrading the CPU to a 2.6GHz Celeron has made the old PC feel a little faster in most of the GUI applications I use. I suspect a higher-performance GPU would make a more noticeable improvement.

Since installing the faster processors, one of the GX240s will 'freeze' after a few hours of running. I suspect the 2.6GHz CPU is overheating, as the stock heat-sink depends on the shrouded case fan exhausting heat from the case. I am going to change the passive heat-sink for a fan-cooled version.

I bought another two SL6VV processors for £2.49 each and I am now on the lookout for a pair of Socket 478 coolers. Despite the small setback due to passive cooling, this upgrade was worth doing considering how cheap it was.

Categories
Ubuntu

World Community Grid Certificate Problem

I recently installed BOINC on one of my Ubuntu machines, but it wouldn't do any work for the World Community Grid (WCG). The message log showed 'Scheduler request failed: peer certificate cannot be authenticated with known CA certificates'. I tracked the problem down on the BOINC website: it is caused by a digital certificate that is required by WCG but not included with Ubuntu's BOINC distribution. Fortunately, the fix is very simple. Just download the missing certificate, copy the file to /var/lib/boinc-client, then restart BOINC.
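The steps reduce to something like this — the certificate filename is an assumption; use whatever name the BOINC page gives the download:

```shell
# Copy the downloaded CA certificate into the client's data directory
sudo cp ca-bundle.crt /var/lib/boinc-client/
# Restart the client so it picks the certificate up
sudo /etc/init.d/boinc-client restart
```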

The fault fix on the BOINC website