Recommended reading:
http://www.postgresql.org/docs/tutorial/index.html
The modules in this pack, in general, provide the functions described in the text below.
Bug reports and notes on other possible caveats are welcome.
Alex Shevlakov,
sixote@yahoo.com
==================================================================
by
Markus Neteler and Alex Shevlakov
The following text introduces you to the interface usage. This text is subject to change...
Let's start.
Enter GRASS 5. Of course you need to have a location defined and some data you want to import here.
------------------------------------------------------------------------
A) First see if PostgreSQL is working - check for errors
------------------------------------------------------------------------
We begin with looking at the list of existing databases:
g.select.pg -l
Error: select Postgres:connectDB() failed: Is the postmaster running and
accepting TCP/IP (with -i) connections at '130.77.22.66' on port '5432'?
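Whether the postmaster actually accepts TCP/IP connections can be probed directly; a minimal sketch in Python (host and port are the examples from the error message above, adjust them to your server):

```python
import socket

def postmaster_reachable(host, port=5432, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        # connect_ex() returns 0 on success, an errno otherwise
        return s.connect_ex((host, port)) == 0
    finally:
        s.close()

print(postmaster_reachable('127.0.0.1'))
```

If this prints False, the postmaster is either not running or was started without the -i switch (i.e. without TCP/IP sockets).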
Check the log file:
cat /var/log/postgresql.log
It tells us:
No data directory -- can't proceed.
/usr/lib/pgsql/bin/postmaster does not find the database system. Expected
to find it in the PGDATA directory "/var/lib/pgsql/data", but unable to open
file with pathname "/var/lib/pgsql/data/base/template1/pg_class".
In this case we have to install further packages (here: SuSE Linux; package names may be different in your installation).
Restart the "postmaster" (the daemon listening for db queries):
cd /sbin/init.d/rc2.d/
./S25postgres start
Is the postmaster there? You should see it in the process list:
ps -ax | grep postmaster
g.select.pg -l
If you get the error:
Error: select Postgres: User authentication failed
you need to set up the PostgreSQL users list. Otherwise jump to letter B in the text.
Set a password for user "postgres" (this is needed the first time only!):
su
passwd postgres
[set the password]
exit
Now log in as user "postgres":
su - postgres
and add yourself as a PostgreSQL user:
createuser neteler
Enter user's postgres ID or RETURN to use unix user ID: 601 -> <return>
Is user "neteler" allowed to create databases (y/n) y
Is user "neteler" allowed to add users? (y/n) n
createuser: neteler was successfully added
exit
Now you are back in the GRASS environment. Again we try:
g.select.pg -l
The following databases are in the Unix catalogue:
template1
This indicates that your PostgreSQL environment is o.k.
------------------------------------------------------------------------
B) Using GRASS/PostgreSQL: creating a database table
------------------------------------------------------------------------
Say you have a SHAPE file set: humus.shp, humus.shx and humus.dbf. The file *.shp will be imported into GRASS, the file *.dbf into PostgreSQL.
Now you have to use "createdb" to create a database. First we create an empty database:
createdb humus
This new database we select in GRASS:
g.select.pg database=humus
To destroy a database use "destroydb humus" (PostgreSQL 6.x) or "dropdb humus" (PostgreSQL 7.x).
In this first step we import the plain attribute table only, without the geographical features. To import the dBASE table into PostgreSQL, enter:
pg.in.dbf in=humus.dbf
You will be asked:
Additionally dump to ASCII file (enter full Unix name or hit <return> for none):
Executing create table humus (AREA float4,PERIMETER float4,G2_UEB09_
int8,G2_UEB09_I int8,STONR int4,BOTYP text,HORIZ text,BODART text,HUMUS
float4,SKELETT text)
The table is imported into PostgreSQL now. Note: the vectors are not yet imported!
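The column types in the CREATE TABLE above are derived from the DBF field descriptors. The exact rules pg.in.dbf applies are not shown here, but a plausible mapping (an assumption, for illustration only) looks like this:

```python
def dbf_to_sql_type(ftype, length, decimals):
    """Map a DBF field descriptor to a PostgreSQL type (assumed rules)."""
    if ftype == 'C':                      # character field -> text
        return 'text'
    if ftype == 'N':                      # numeric field
        if decimals > 0:                  # has decimal places -> float
            return 'float4'
        # wide integer fields need 8 bytes, narrow ones fit in 4
        return 'int8' if length > 9 else 'int4'
    return 'text'                         # fall back to text

print(dbf_to_sql_type('N', 8, 2))   # e.g. the AREA field -> float4
```

This is consistent with the output above: AREA/PERIMETER/HUMUS become float4, STONR int4, the wide G2_UEB09_ fields int8, and character fields like BOTYP become text.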
------------------------------------------------------------------------
C) Getting simple table statistics
------------------------------------------------------------------------
Now we want to get some information.
True to real life, we start with an error...
g.stats.pg table=humus col=BOTYP
Error: connect Postgres:ERROR: No such function 'min' with the specified attributes
The reason is that BOTYP is a text field (see above in the output of pg.in.dbf). Of course we cannot calculate statistics from letters.
g.stats.pg table=humus col=HUMUS
Min, Max, Mean
-------------------------------------
0, 2.81, 1.72372
Aha, quite nice. It tells us about the humus content of the soil patches in percent.
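Such statistics are plain SQL aggregates; here is a self-contained sketch of the same kind of query, using Python's sqlite3 module in place of PostgreSQL (the values inserted are made up):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE humus (botyp TEXT, humus REAL)')
conn.executemany('INSERT INTO humus VALUES (?, ?)',
                 [('A', 0.0), ('B', 2.81), ('C', 1.5)])
# the kind of aggregate query g.stats.pg boils down to:
mn, mx, mean = conn.execute(
    'SELECT min(humus), max(humus), avg(humus) FROM humus').fetchone()
print(mn, mx, mean)
```

Running min()/avg() on a text column like botyp is exactly what produced the error above.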
destroydb humus # on PostgreSQL 6.x
dropdb humus # on PostgreSQL 7.x
------------------------------------------------------------------------
Now it is becoming more amazing!
We will import our SHAPE file:
Using "verbose=9" we get information about the import progress. The first step is to create an empty Postgres database:
createdb humus
NOTE: You don't have to import the .dbf table manually. It is done by v.in.shape.pg automatically (many thanks to Alex Shevlakov)!
v.in.shape.pg input=humus.shp verbose=9 pgdump=humus
See if the GRASS map is there:
g.list vect
----------------------------------------------
vector files available in mapset geosum:
humus
----------------------------------------------
First we have to build the map topology:
v.support m=humus o=build
Start a GRASS monitor:
d.mon x0
...and display the map:
d.vect humus
Query it using PostgreSQL:
d.what.v.pg -f m=humus tab=humus col=HUMUS
Interim note: v.in.shape[.pg] only supports lines, not areas yet. We hope for an upgrade...
Here we can use
g.column.pg -v table=humus
| columnname | type | length |
----------------------------------------------------
| area | float4 | 4 |
| perimeter | float4 | 4 |
| g2_ueb09_ | int8 | 8 |
| g2_ueb09_i | int8 | 8 |
| stonr | int4 | 4 |
| botyp | text | var |
| horiz | text | var |
| bodart | text | var |
| humus | float4 | 4 |
| skelett | text | var |
As you can see you get information about the field types within the PostgreSQL table.
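The same kind of listing can be reproduced in any SQL environment; a self-contained sketch with sqlite3's PRAGMA table_info (PostgreSQL keeps the equivalent information in its pg_attribute and pg_type system catalogues):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE humus (area REAL, stonr INTEGER, botyp TEXT)')
# PRAGMA table_info yields one row per column:
# (cid, name, declared type, notnull, default, pk)
for cid, name, ctype, *_ in conn.execute('PRAGMA table_info(humus)'):
    print(f'| {name:10} | {ctype:7} |')
```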
------------------------------------------------------------------------
F) Selecting table entries and displaying on the map
------------------------------------------------------------------------
To select map features and display the selected areas/lines in the GRASS monitor, you can use:
d.vect.pg
An SQL statement is required to select the features of interest. Write it into an ASCII file, e.g. "sql.query":
select stonr from humus where HUMUS > 1.2
[change "humus" to your table name and "HUMUS" to a column existing in your table]
Now query:
d.vect.pg -f -s sql.query map=humus color=red
Alternate method: write the query on the command line:
d.vect.pg -f key=HUMUS tab=humus where='HUMUS>1.2' map=humus col=red
key is the column name, tab the table, and where the statement. "-f" fills the selected areas, col defines their color.
Now you see the selected areas in the GRASS Monitor.
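The key=/tab=/where= parameters amount to composing a single SELECT statement; a sketch of that composition (the actual internals of d.vect.pg may differ):

```python
def build_query(key, tab, where):
    """Compose the SELECT implied by key=/tab=/where= (assumed form)."""
    return f"select {key} from {tab} where {where}"

print(build_query('HUMUS', 'humus', 'HUMUS>1.2'))
# select HUMUS from humus where HUMUS>1.2
```

So the command-line form and the sql-file form of d.vect.pg express the same query.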
------------------------------------------------------------------------
G) Query example 1
------------------------------------------------------------------------
There's a map of digitized forest stands called "terney_id" (say three hundred in the region); each plot has a unique rec_id in the database table info_terney.
Now we'd like to calculate correlations between heights and diameters of trees in plots whose main species is Pinus koraiensis (type_id < 21), with an age of more than 100 years, growing on southern slopes and lying along specific routes. (Remember: if you forgot the column names of your table, use g.column.pg - see above.)
First we highlight in red all plots that satisfy these conditions. We write an SQL-statement ASCII file "query1.sql":
select rec_id from info_terney where type_id < 21 and age > 100 and expo ~ 's'
Then we use the query command:
d.vect.pg -f -s query1.sql map=terney_id color=red
Then we pick these plots with d.what.v.pg, following along our routes:
d.what.v.pg -f map=terney_id tab=info_terney col=rec_id color=green fillcolor=gray hv=h
Recommendation: use TclTkGRASS with PostgreSQL support. Then you can easily copy-paste the results.
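The selection in query1.sql can be tried out standalone on sample data; a sketch with sqlite3 (note that `expo ~ 's'` is a POSIX regular-expression match in PostgreSQL; LIKE is used below as sqlite's closest stand-in, and all data are invented):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE info_terney '
             '(rec_id INTEGER, type_id INTEGER, age INTEGER, expo TEXT)')
conn.executemany('INSERT INTO info_terney VALUES (?, ?, ?, ?)', [
    (1, 15, 120, 's'),    # matches all three conditions
    (2, 30, 150, 's'),    # type_id too large
    (3, 12, 80,  'sw'),   # too young
    (4, 18, 200, 'n'),    # wrong exposition
])
rows = conn.execute(
    "select rec_id from info_terney "
    "where type_id < 21 and age > 100 and expo like '%s%'").fetchall()
print(rows)  # only plot 1 qualifies
```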
------------------------------------------------------------------------
H) Import of ARC ungenerate files
------------------------------------------------------------------------
Note: Currently the import is limited to interactive mode.
In this example we have a landuse map in ungen format (one file containing lines, one file containing label points, and one file containing label text = attributes).
The files are expected in your current directory (unlike v.in.arc, which expects them in $LOCATION/arc/!). Copy PAT.dbf (or AAT.dbf) there, too, and rename it to a reasonable name matching your other ungen files.
Note: PostgreSQL does not allow field names containing "-" (ilde-id is forbidden, ilde_id is accepted)!
Our files are:
il_nutz.pnt il_nutz.pol il_nutz.dbf
Now we start the import:
createdb il_nutz (only if you have not created the database yet)
v.in.arc.pg
GRASS vector file: landuse (the new name of the map in GRASS)
Coverage type: polygon
Neatline: no (a neatline is a line surrounding the map)
Lines file ARC/INFO: il_nutz.pol
Labels file ARC/INFO: il_nutz.pnt
Name of .dbf file to be imported: il_nutz.dbf
Admin/normal user dump mode: <return>
Additionally dump to ASCII file: <return>
Table il_nutz successfully copied into Postgres. Congratulations!
Now run v.support on the imported vector map.
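For reference, the ARC ungenerate line/polygon format that v.in.arc.pg reads is simple enough to parse by hand; a minimal sketch, assuming the common layout (an ID line, coordinate pairs, "END" per feature, and a final "END" closing the file):

```python
def parse_ungenerate(text):
    """Parse ARC ungenerate lines/polygons into {id: [(x, y), ...]}."""
    features, coords, fid = {}, [], None
    for line in text.strip().splitlines():
        tok = line.split()
        if tok[0].upper() == 'END':
            if fid is None:            # second END in a row: end of file
                break
            features[fid] = coords     # END closes the current feature
            fid, coords = None, []
        elif fid is None:              # first line of a feature: its ID
            fid = int(tok[0])
        else:                          # a coordinate pair
            coords.append((float(tok[0]), float(tok[1])))
    return features

sample = """1
10.0 20.0
11.0 21.0
END
2
30.0 40.0
31.0 41.0
END
END
"""
print(parse_ungenerate(sample))
```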
------------------------------------------------------------------------
I) Reclass of vector map
------------------------------------------------------------------------
v.reclass.pg
[...]
------------------------------------------------------------------------
J) Problems importing DBF-tables into PostgreSQL?
------------------------------------------------------------------------
The most common problem (as I run into it too often) while
converting *.dbf files to Postgres with pg.in.dbf, v.in.arc.pg and
v.in.shape.pg is a format mismatch - a pg_atoi() ERROR - saying there is something
in a field declared as int (or float) that does not look like a number,
such as "***", "NO", "infrared spectrum", etc.
Alas, Postgres is very restrictive (unlike we people who type in this stuff). My approach (as I still want to import the .shp from Arcview that my colleague sent me from ArcviewLand): exit the program - it will already have done so - and rerun the module, answering "no" to the dbf-to-postgres dump. Okay, after we have imported the coverage, let's use pg.in.dbf and answer "yes" to the question "Do you want additionally DUMP to ASCII?".
Now find the line(s) that spoil your breakfast (hopefully there are not many), kill 'em all, and then use psql and COPY your table FROM '/home/user/user.stuff'. That's it. Besides, dumping to an ASCII file and fixing things in it before importing to Postgres is a must when there are weird encoding chars in the text fields.
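Hunting for the offending line(s) in the ASCII dump can be automated; a minimal sketch that flags values a numeric column would reject (column indices and the '|' delimiter are assumptions, adjust them to your dump):

```python
def find_bad_numerics(lines, numeric_cols, sep='|'):
    """Return (line_no, column, value) for values that won't parse as numbers."""
    bad = []
    for no, line in enumerate(lines, 1):
        fields = line.rstrip('\n').split(sep)
        for col in numeric_cols:
            value = fields[col].strip()
            try:
                float(value)              # int columns parse as float too
            except ValueError:            # this is what trips pg_atoi()
                bad.append((no, col, value))
    return bad

dump = ['1|12.5|forest', '2|***|meadow', '3|NO|swamp']
print(find_bad_numerics(dump, numeric_cols=[1]))
# flags the '***' and 'NO' values on lines 2 and 3
```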
Alex