Installing a working pyenv on RHEL/CentOS 5.x

January 19, 2015

RHEL/CentOS 5.x has long since lost its freshness, but some of us are still running servers with it, and can do so for several more years before its end of life.

Perhaps, like me, you have a need to run a more modern version of Python than 5.x installs by default. I recently found pyenv, and it looked like a perfect fit for my needs, as I didn’t want to mess with the system version or build custom RPMs.

Once I installed the build requirements, I used the handy pyenv-installer project to get it up and running, then ran the simple command to install Python 2.7.8:
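pyenv install 2.7.8

Unfortunately, things hit a bump.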

Error message:
subprocess.CalledProcessError: Command '['wget', 'https://pypi.python.org/packages/source/s/setuptools/setuptools-7.0.zip', '--quiet', '--output-document', '/tmp/python-build.20141210170309.3741/Python-2.7.8/setuptools-7.0.zip']' returned non-zero exit status 1

After a bit of digging, I found that this was actually due to a bug in version 1.11 of wget. As far as I could tell, this issue was not fixed upstream until the 1.12 release, and CentOS 5.x is frozen at 1.11.

I decided it was worth building a custom wget RPM package for my 5.x servers to get past this issue. After setting up my RPM build environment, I headed over to rpm.pbone.net to locate a suitable source RPM. wget-1.12-4.fc14.src.rpm ended up suiting my needs – the newer wget source RPMs had build dependencies that were awkward to fulfill on 5.x, and 1.12 was new enough to fix the bug.
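For reference, setting up the build environment looks roughly like this (the package list is an approximation; the spec file dictates the exact build dependencies, and yum-builddep comes from yum-utils):

yum install rpm-build redhat-rpm-config gcc make   # basic build toolchain
yum install yum-utils                              # provides yum-builddep
yum-builddep wget-1.12-4.fc14.src.rpm              # pull the declared build deps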

Per http://wiki.centos.org/HowTos/RebuildSRPM, the --nomd5 switch is needed when installing newer Fedora source RPMs:

rpm --nomd5 -ivv wget-1.12-4.fc14.src.rpm

From there it was simply a matter of building the RPM:

cd /usr/src/redhat/SPECS
rpmbuild -bb wget.spec

Then I installed it by finding the wget-1.12-4 RPM file in one of the subdirectories of /usr/src/redhat/RPMS, changing to that directory, and running:

yum --nogpgcheck localinstall wget-1.12-4.x86_64.rpm
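A quick check that the new build took:

wget --version | head -n 1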

After this, pyenv install [version] should work as advertised… Python for the modern age!

Automating sysctl deployments using sysctl.d on RHEL/CentOS 5.x, 6.x, 7.x

January 8, 2015

I’ve fallen in love with automated server deployments in the last year, with my primary weapon being Salt.

One of the corner cases I’ve run into is adding sysctl settings specific to a feature set. For example, when a server needs Redis installed, I want to add the following kernel optimization via sysctl:

    vm.overcommit_memory = 1

It’s sloppy to add this to /etc/sysctl.conf – too hard to maintain in a modular fashion. Wouldn’t it be nice if there were a place we could drop a file with that sysctl setting in it, which would be automatically read on boot? That would make adding and removing multiple sysctl settings a breeze to automate.

Well, it turns out that RHEL/CentOS does have this support via /etc/sysctl.d. While only RHEL/CentOS 7.x ships the directory out of the box, all three versions read it via their init scripts: anything placed in /etc/sysctl.d is applied on boot, provided the networking init script’s start action is called (it’s enabled by default).
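For the Redis example above, the drop-in is just a one-line file (the filename here is my own choice):

# /etc/sysctl.d/redis.conf
vm.overcommit_memory = 1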

Unfortunately, the networking init script is a bit of an odd place to hook a reload of the sysctl settings. I also wanted the ability to reload the sysctl settings on demand, as part of a feature installation on a running server.

The path to get this feature turned out to be pretty short. /etc/init.d/functions contains an apply_sysctl function which handles all the dirty work of completely reloading all sysctl settings, including those placed in /etc/sysctl.d. An extremely short wrapper script does the job; a minimal sketch of it, assuming apply_sysctl behaves as described:
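#!/bin/bash
# Reload all sysctl settings, including anything in /etc/sysctl.d,
# by reusing the helper the init scripts themselves use.
. /etc/init.d/functions
apply_sysctl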

Armed with that script, I simply use Salt to automatically install it to /usr/local/bin on all servers, and call it any time a file in /etc/sysctl.d is added, removed, or modified.

Nginx logs in JSON format

July 14, 2012

I’ve recently decided that it’s a good idea to output server logs in JSON format. To this end, today I took some time to figure out how to do this for Nginx. The log_format directive is the one you want: I simply added another named format to the http section of nginx.conf, which then allows the named format to be used in any other config file. Here’s what I whipped up – this is just the default main format ported to JSON:

log_format  json  '{'
                    '"remote_addr": "$remote_addr",'
                    '"remote_user": "$remote_user",'
                    '"time_local": "$time_local",'
                    '"request": "$request",'
                    '"status": $status,'
                    '"body_bytes_sent": $body_bytes_sent,'
                    '"http_referer": "$http_referer",'
                    '"http_user_agent": "$http_user_agent",'
                    '"http_x_forwarded_for": "$http_x_forwarded_for"'
                  '}';

I formatted it one row per parameter in the config file, as it’s easier for me to read, but Nginx will concatenate all those separate strings into one line in the log file. Once this is done, use the format in any of the places where it’s accepted, for example:

access_log /path/to/file/access.log json;
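One caveat: Nginx doesn’t escape these variables, so a stray double quote in, say, the user agent will produce an invalid JSON line (newer Nginx, 1.11.8+, can address this with the escape=json parameter on log_format). A quick way to sanity-check the output, assuming jq is installed and using the example path above:

tail -n 5 /path/to/file/access.log | jq .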

For more details on all this stuff, check out the online documentation for Nginx’s logging module.

Resend Postfix messages stuck in mail queue to another address

July 6, 2012

Update 2015-02-19: Based on suggestions from David Keegel, I’ve tweaked the script to properly hold/unhold messages, and include the sender’s email address in the message envelope.

Ever had somebody give you a bad email address, and the messages pile up in your Postfix mail queue? Even if you have the right address to send the stuck emails to, Postfix (at least on CentOS 5.x based on all my research) provides no easy way to:

  • Resend all those messages to the new address
  • Remove the old stuck messages

We needed this functionality badly for our business, where we unfortunately get bad email addresses given to us all the time. So, using my marginal bash skills and several hours of my time, I whipped up the script below to automate this process.

Caveat: I’ve tested this script extensively with Postfix on CentOS 5.x, but nowhere else, really. So test first, use at your own risk!
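The script boils down to something like this (a rough sketch, not the battle-tested original: the mailq parsing and the postcat section markers are assumptions about Postfix’s output format, so verify them on your system first):

#!/bin/bash
# Sketch: resend every queued message addressed to OLD to NEW,
# then delete the stuck original.
OLD="$1"
NEW="$2"

# In mailq output, the queue ID line also carries the sender address;
# recipient addresses follow on indented lines.
mailq | awk -v old="$OLD" '
  /^[0-9A-F]+[*!]?/ { id = $1; sub(/[*!]$/, "", id); sender = $7 }
  $1 == old         { print id, sender }' |
while read -r id sender; do
  postsuper -h "$id"   # put the message on hold so Postfix leaves it alone
  # Pull the raw message out of the queue file and resend it,
  # preserving the original sender in the message envelope.
  postcat -q "$id" |
    sed -n '/^\*\*\* MESSAGE CONTENTS/,/^\*\*\* HEADER EXTRACTED/p' |
    sed '1d;$d' |
    sendmail -f "$sender" "$NEW"
  postsuper -d "$id"   # remove the stuck original from the queue
done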

db-query-assistant package released, now available via npm

April 28, 2012

Today I’m happy to announce the official public release of db-query-assistant.

For those who haven’t read my initial post on the module, here is a quick summary:

  • High-level library for node database drivers.
  • Configurable connection pooling.
  • Issue multiple simultaneous queries, and get all results back in a callback when the last query completes.
  • Issue queries in series, getting the results for each previous query back before executing the next one.
  • Issue transactional queries, with automatic rollback on query failure.

This release includes a fairly comprehensive unit test suite for the core library, using the awesome mocha test framework. I decided against unit tests for the drivers, as I feel those would be better served by integration tests (which I may add in the future).

The module has been published to the npm registry, and a simple npm install db-query-assistant will now do the trick.

Sampler API 6.x-1.1 released

March 25, 2012

The 6.x-1.1 release of Sampler API is now out and ready for download. This has a number of important bugfixes, as well as the addition of locking support, so the same metric cannot be started again while a run is already in progress.

Most of the fixes came out of the work I’m doing to get metrics collection fully deployed on drupal.org.

Sampler API, a metrics framework for Drupal

March 22, 2012

About a year ago, I wrote a Drupal module called Sampler API, which is a framework for data collection and storage with built-in Views integration (credit to mikey_p for the Views code).

From the project page: “The Sampler API allows modules to easily collect and store calculated pieces of data. It’s designed primarily to assist in collection, storage, and display of metrics.”

The module hasn’t seemed to get much attention, and I’m guessing at least part of the reason is that it has never had an official release. Well, today that changes. ;)

A 6.x-1.0 release is now available, and the 7.x development release is also working pretty well – an official release for that branch should follow soon.

Environment management script for FreeSWITCH

March 18, 2012

As a FreeSWITCH consultant, one of the things I do regularly is set up new environments for working with clients. Yesterday I decided it would be nice to automate the process of basic management for these installations, and the freeswitch-env script was born.

Here’s the basics:

Usage: freeswitch-env <install|uninstall|list> <env-name>
  env-name: The name of the environment to create. Will be appended to
  'freeswitch-' to create the final environment name
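For example (the environment name here is illustrative):

freeswitch-env install acme      # builds and installs 'freeswitch-acme'
freeswitch-env uninstall acme    # tears it back down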

Install goes through the standard configure/make/install steps, plus handles installing the default sounds, music on hold, and config files. If configured, it will also set up an init script for starting/stopping the install (I’ve only tested this on Red Hat, but others would probably work with minor tweaking).

The script is fairly configurable, allowing you to specify a custom source code location, base install path for environments, user to run FreeSWITCH as, etc.

It’s released as part of my new freeswitch-utils package, where I’ll be adding other handy scripts, etc. that I find useful when working with FreeSWITCH.

Updates/improvements to db-query-assistant

March 16, 2012

I woke up this morning with a desire to better abstract the driver code for db-query-assistant mentioned in this post, and after a few hours of tinkering, profit!

So today is the release of version 0.2.0 of this helper library. All database driver code is now pluggable, and I’ve started with drivers for db-mysql and db-drizzle of the nodejsdb project.

Other node-based database connection modules would probably be pretty easy to provide drivers for, so feel free to submit pull requests to add new ones!

Helper library for node database drivers: connection pooling, multi/series querying

March 14, 2012

Summary

Update

I’ve renamed the repository for this software package; it’s now available here. Links in this post have been updated accordingly.

Today I’m releasing a nifty helper library I wrote to ease the pain of async database queries in node.js. It provides the following features on top of node-db drivers:

  • Configurable connection pooling.
  • Issue multiple simultaneous queries, and get all results back in a callback when the last query completes.
  • Issue queries in series, getting the results for each previous query back before executing the next one.

At this point it only supports the db-mysql driver, as I haven’t abstracted the code to make the other drivers pluggable. That would be a simple thing to do, though, so feel free to submit a pull request!

It’s also possible that this could be abstracted further to provide functionality across different node-based database drivers.

This library was pretty easy to code up thanks to the awesome async and generic-pool modules.

Installation

Code

Available on GitHub. No npm package yet; when the API is fully hardened, it will be packaged.

Dependencies

  • db-mysql
  • async
  • generic-pool

Usage

See the README.
