In order to use a version that is as compatible as possible with Amazon Redshift you need version 8.0.2.
You can build it in a few minutes.
sudo apt-get install make gcc libreadline-dev zlib1g-dev -y
Choose the directory where you want to install it; I recommend using
export PGROOT=/usr/local/
Then download it and build it with the following commands
tar xf postgresql-8.0.2.tar.gz
cd postgresql-8.0.2
./configure --prefix=$PGROOT
make
sudo make install
Do not forget to add $PGROOT/bin to your $PATH: it should already be there if you used the PGROOT recommended above.
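You can quickly check that the new binaries take precedence by prepending $PGROOT/bin to your PATH, a minimal sketch (trailing slash on PGROOT dropped here, it makes no difference to the shell):

```shell
# Prepend the freshly built binaries to PATH (assumes the PGROOT above)
export PGROOT=/usr/local
export PATH="$PGROOT/bin:$PATH"
# The new directory should now come first in PATH
echo "$PATH" | cut -d: -f1
```

If psql is installed, `psql --version` should then report 8.0.2.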
Consider setting your environment to point to your main database. For example, I added to my
~/.bashrc something like
export PGHOST=mydb-instance.cd274s5bo4aq.eu-west-1.redshift.amazonaws.com
export PGPORT=5439
export PGDATABASE=mydb
export PGUSER=mydb_user
PGHOST is your Redshift hostname. You can find it in the AWS console: look for
Cluster Endpoint in your Redshift instance Configuration tab; it is something like the value shown above.
PGPORT is the Redshift port; 5439 is the default.
PGDATABASE is your database name.
PGUSER is your database user name.
Do not use
PGPASSWORD to set credentials: it is strongly recommended to use a pgpass file instead (read below).
It is a good choice to use a
pgpass file to store passwords. Create it and restrict its permissions
touch ~/.pgpass
chmod 600 ~/.pgpass
then add lines in the following format
echo $PGHOST:$PGPORT:$PGDATABASE:$PGUSER:password >> ~/.pgpass
Finally edit
~/.pgpass and replace password with your real password.
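The steps above can be wrapped in a small helper; this is only a sketch (the function name add_pgpass_entry is mine, not part of psql) that appends an entry and keeps the 0600 permissions psql requires:

```shell
# Hypothetical helper: append host:port:database:user:password to ~/.pgpass
# and enforce the 0600 permissions psql requires. Note that any colon or
# backslash inside the password must be escaped with a backslash.
add_pgpass_entry() {
  printf '%s:%s:%s:%s:%s\n' "$1" "$2" "$3" "$4" "$5" >> "$HOME/.pgpass"
  chmod 600 "$HOME/.pgpass"
}

# Example usage:
# add_pgpass_entry "$PGHOST" "$PGPORT" "$PGDATABASE" "$PGUSER" 'secret'
```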
Now you can connect to the database just by launching
psql. It is possible to have multiple lines in the pgpass file: the environment variables
are read first, then the matching password is taken from the pgpass file.
To also display the database user in the prompt, put a prompt setting in your
~/.psqlrc; see the PROMPTING section of the psql manual for more choices.
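A minimal ~/.psqlrc sketch, assuming the standard psql prompt escapes (%n is the session user, %/ the current database, %R the prompt status, %# the superuser marker):

```
\set PROMPT1 '%n@%/%R%# '
```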