Monday, March 31, 2008

CDDIS Data Center

GPS Data Sets

The ftp archive of GPS data and products on the CDDIS has the following structure. The values shown in red are substitution codes used within directory and file names; they are defined in Table 4 below. Many GPS data and product files archived at the CDDIS are stored in UNIX compress format and end in .Z; software to uncompress these files can be found at http://www.gzip.org.
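For example, a directory of downloaded .Z files can be uncompressed with gzip. A small sketch (the file below is a stand-in compressed with gzip itself, since the old compress program is rarely installed; gzip reads both the LZW .Z and the .gz formats):

```shell
#!/bin/sh
# Create a stand-in compressed file (real CDDIS .Z files are UNIX
# compress/LZW; gzip decompresses those too).
echo "dummy RINEX content" > demo0010.05d
gzip -S .Z demo0010.05d            # creates demo0010.05d.Z
# Uncompress every .Z file in the current directory:
for f in *.Z; do
    gzip -d "$f"
done
cat demo0010.05d
```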

This information on the GPS ftp structure at the CDDIS is divided into several tables:

  • Table 1: Overview of the ftp structure for GPS data and products
  • Table 2: Detailed diagram of the directory structure for GPS data
  • Table 3: Detailed diagram of the directory structure for GPS products
  • Table 4: Substitution codes for directories and filenames
  • Table 5: Examples

Table 1. Overview of CDDIS FTP Structure for GPS Data and Products

/pub
  /gps
    /data
      /campaign  /location /yyyy
      /daily     /yyyy /ddd /yyt      monuddd#.yyt.Z
      /highrate  /yyyy /ddd /yyt /hh  monudddhmi.yyt.Z
      /hourly    /yyyy /ddd /hh       monuddd#.yyt.Z
      /satellite /satname /yyyy /ddd  satcddd#.yyt.Z
    /products                         igs00p02.erp.Z, igs95p02.erp.Z, igs96p02.erp.Z
      /wwww                           cenwwwwd.typ.Z, igrwwwwd.typ.Z, IGSyyPww.typ.Z, igsyyPwwww.typ.Z, iguwwwwd_hr.typ.Z
        /trop                         monuwwww.zpd.Z
      /ionex /yyyy /ddd               cengddd0.yyi.Z
        /topex                        cengddd0.yyi.Z, jpltpx_yymmdd.dat.Z
        /valid                        cenwwwww.yyv.Z
      /latest /final                  cenwwwwd.typ.Z
              /rapid                  igrwwwwd.typ.Z
              /ultra                  iguwwwwd_hr.typ.Z
      /leopp /champ /center
             /jason /center
             /tools
      /trop /wwww                     monuwwww.zpd.Z
            /yyyy                     CONTENT_yyyy_mm.Z, zpd_yyyy_mm.tar.Z
            /nrt /wwww                TROPSUM_wwwwd.ovw.Z, TROP_wwwwd_hh.ovw.Z, igs60_wwwwd_hh.ovw.Z
      /trop_new /yyyy /ddd            monuddd0.yyzpd.gz

Table 2. CDDIS FTP Structure for GPS Data

Directory                                  Filename            Description
/pub/gps/data/campaign/location/yyyy                           GPS campaign data
/pub/gps/data/daily/yyyy/ddd/yyt           monuddd#.yyt.Z      Daily files of 30-second GPS data (t = d, g, m, n, o, s)
/pub/gps/data/highrate/yyyy/ddd/yyt/hh     monudddhmi.yyt.Z    15-minute files of 1-second GPS data (t = d, m, n)
/pub/gps/data/hourly/yyyy/ddd/hh           monuddd#.yyt.Z      Hourly files of 30-second GPS data
/pub/gps/data/satellite/satname/yyyy/ddd   satcddd#.yyt.Z      Daily files of LEO GPS satellite data
/pub/gps/products                                              GPS product files (see Table 3)
(other directories under /pub)                                 Other data

Table 3. CDDIS FTP Structure for GPS Products

Directory                              Filename(s)                                 Description
/pub/gps/data                          (see Table 2)                               GPS data
/pub/gps/products/wwww                 cenwwwwd.typ.Z, igrwwwwd.typ.Z,
                                       IGSyyPww.typ.Z, igsyyPwwww.typ.Z,
                                       iguwwwwd_hr.typ.Z                           Weekly GPS product files
/pub/gps/products/wwww/trop            monuwwww.zpd.Z                              Weekly GPS troposphere solutions by site
/pub/gps/products/ionex/yyyy/ddd       cengddd0.yyi.Z                              Daily IONEX product files
/pub/gps/products/ionex/topex          cengddd0.yyi.Z, jpltpx_yymmdd.dat.Z         Daily TOPEX ionosphere validation files
/pub/gps/products/ionex/valid          cenwwwww.yyv.Z                              Weekly ionosphere validation files
/pub/gps/products/latest/final         cenwwwwd.typ.Z                              Most recent IGS final product files
/pub/gps/products/latest/rapid         igrwwwwd.typ.Z                              Most recent IGS rapid product files
/pub/gps/products/latest/ultra         iguwwwwd_hr.typ.Z                           Most recent IGS ultra-rapid product files
/pub/gps/products/trop/wwww            monuwwww.zpd.Z                              Weekly GPS troposphere solutions by site
/pub/gps/products/trop/yyyy            CONTENT_yyyy_mm.Z, zpd_yyyy_mm.tar.Z        Monthly GPS troposphere solutions
/pub/gps/products/trop/nrt/wwww        TROPSUM_wwwwd.ovw.Z, TROP_wwwwd_hh.ovw.Z,
                                       igs60_wwwwd_hh.ovw.Z                        Near real-time sub-daily troposphere solutions
/pub/gps/products/trop_new/yyyy/ddd    monuddd0.yyzpd.gz                           Daily troposphere solutions (new format)
Table 4. Substitution Codes for Directories and Filenames

Code Description Range Example
yyyy 4-digit year 1992-present 2005
yy 2-digit year 92-present 05
wwww 4-digit GPS week number 0649-present 1303
ww 2-digit week of year 01-52 01
ddd 3-digit day of year 001-366 001
dd 2-digit day of month 01-31 01
d 1-digit day of week 0-6, 7=full week 0
mm 2-digit month of year 01-12 01
hh 2-digit UTC hour of day 00-23 00
h 1-character UTC hour of day a-x (a=00, b=01, ..., x=23) a
hr 2-digit hour for ultra-rapid products 00, 06, 12, 18 00
mi 2-digit minute of hour 00, 15, 30, 45 00
t type of data file (example: d):
    d = compact RINEX observation data
    g = RINEX GLONASS navigation data
    m = RINEX meteorological data
    n = RINEX GPS navigation data
    o = RINEX observation data
    s = teqc summary file
typ type of product file (example: sp3):
    clk  = satellite clock solutions
    cls  = clock combination summary report
    erp  = Earth rotation parameters
    res  = residuals file
    sp3  = SP3 orbit format
    sp3c = extended SP3 orbit format
    snx  = station positions in SINEX format
    ssc  = sets of station coordinates
    sum  = analysis summary report
# file sequence number (typically 0) 0-9 0
monu monument name Full list gode
cen analysis center Full list igs
satname satellite full name N/A champ
satc satellite code N/A cham
location campaign location Full list maui
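The wwww and d codes can be derived from a calendar date. A quick sketch using GNU date (GPS week 0 began on Sunday, 6 January 1980; day 0 = Sunday):

```shell
#!/bin/sh
# Print the GPS week (wwww) and day-of-week (d) codes for a given date.
# Requires GNU date for the -d option.
gpsweek() {
    t1=`date -u -d "$1" +%s`
    t0=`date -u -d "1980-01-06" +%s`
    days=$(( (t1 - t0) / 86400 ))
    printf "%04d %d\n" $(( days / 7 )) $(( days % 7 ))
}
gpsweek 2005-01-01    # prints "1303 6", matching the Table 5 examples
```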

Table 5. Examples

GPS Data Type Location of Example File in CDDIS ftp Archive
Daily 30-second compact RINEX observation data for GODE on 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/data/daily/2005/001/05d/gode0010.05d.Z
15-minute 1-second compact RINEX observation data for GODE for UTC hour 0 on 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/data/highrate/2005/001/05d/00/gode001a00.05d.Z
Hourly 30-second compact RINEX observation data for GODE for UTC hour 0 on 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/data/hourly/2005/001/00/gode001a.05d.Z
Daily LEO satellite compact RINEX observation data for CHAMP on 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/data/satellite/champ/2005/001/05d/cham0010.05d.Z
GPS Product Type Location of Example File in CDDIS ftp Archive
Weekly IGS final orbit file for day 6 of GPS week 1303 (01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/igs13036.sp3.Z
Weekly IGS cumulative reference frame station positions for first week of 2005 ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/IGS05P01.snx.Z
Weekly IGS reference frame station positions for GPS week 1303 (12/26/2004-01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/igs05P1303.snx.Z
IGS rapid orbit file for day 6 of GPS week 1303 (01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/igr13036.sp3.Z
IGS ultra-rapid orbit file for UTC hour 0 of day 6 of GPS week 1303 (01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/igu13036_00.sp3.Z
Accumulated IGS ERP estimates since 1995 for the IGS final products ftp://cddis.gsfc.nasa.gov/pub/gps/products/igs95p02.erp
Accumulated IGS ERP estimates since 1995 for the IGS rapid products ftp://cddis.gsfc.nasa.gov/pub/gps/products/igs96p02.erp
Accumulated IGS ERP estimates since 2000 for the IGS SINEX products ftp://cddis.gsfc.nasa.gov/pub/gps/products/igs00p02.erp
Weekly troposphere solution for GODE for GPS week 1303 ftp://cddis.gsfc.nasa.gov/pub/gps/products/1303/trop/gode1303.zpd.Z
(also available as) ftp://cddis.gsfc.nasa.gov/pub/gps/products/trop/1303/gode1303.zpd.Z
Daily troposphere solution (new format) for GODE on 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/products/trop_new/2005/001/gode0010.05zpd.gz
Combined IONEX solution for 01/01/2005 ftp://cddis.gsfc.nasa.gov/pub/gps/products/ionex/2005/001/igsg0010.05i.Z
Ultra-rapid zenith path delay files for UTC hour 2 of day 6 of GPS week 1303 (01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/trop/nrt/1303/igs60_13036_02.tro.Z
Ultra-rapid zenith path delay overview files for day 6 of GPS week 1303 (01/01/2005) ftp://cddis.gsfc.nasa.gov/pub/gps/products/trop/nrt/1303/igs60_13036.ovw.Z
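Putting the substitution codes together, the daily-data URLs in the table above can be assembled mechanically. A sketch in shell, with values chosen to reproduce the first example:

```shell
#!/bin/sh
# Build a CDDIS daily-data URL from the Table 4 substitution codes,
# following the pattern /pub/gps/data/daily/yyyy/ddd/yyt/monuddd#.yyt.Z
monu=gode; yyyy=2005; ddd=001; t=d; seq=0
yy=`echo $yyyy | awk '{print substr($0,3,2)}'`
url="ftp://cddis.gsfc.nasa.gov/pub/gps/data/daily/$yyyy/$ddd/$yy$t/$monu$ddd$seq.$yy$t.Z"
echo "$url"
```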

Sunday, March 30, 2008

How to write man pages?

I want to write some man pages for my own GPSF programs. Where can I find detailed instructions and syntax for writing Unix man pages?
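For a start, here is a minimal skeleton using the man(7) macro package ("gpsf" and all of its content are made-up placeholders; preview the result with nroff -man gpsf.1 | less):

```shell
#!/bin/sh
# Write a minimal man page in man(7) macros to gpsf.1 (hypothetical name).
cat > gpsf.1 <<'EOF'
.TH GPSF 1 "March 2008" "GPSF" "User Commands"
.SH NAME
gpsf \- one-line description of the program
.SH SYNOPSIS
.B gpsf
.RI [ options ] " file" ...
.SH DESCRIPTION
Longer description of what the program does.
.SH OPTIONS
.TP
.B \-v
Verbose output.
.SH SEE ALSO
.BR man (7)
EOF
ls gpsf.1
```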

wget help

GNU Wget 1.10.2, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
-V, --version display the version of Wget and exit.
-h, --help print this help.
-b, --background go to background after startup.
-e, --execute=COMMAND execute a `.wgetrc'-style command.

Logging and input file:
-o, --output-file=FILE log messages to FILE.
-a, --append-output=FILE append messages to FILE.
-d, --debug print lots of debugging information.
-q, --quiet quiet (no output).
-v, --verbose be verbose (this is the default).
-nv, --no-verbose turn off verboseness, without being quiet.
-i, --input-file=FILE download URLs found in FILE.
-F, --force-html treat input file as HTML.
-B, --base=URL prepends URL to relative links in -F -i file.

Download:
-t, --tries=NUMBER set number of retries to NUMBER (0 unlimits).
--retry-connrefused retry even if connection is refused.
-O, --output-document=FILE write documents to FILE.
-nc, --no-clobber skip downloads that would download to existing files.
-c, --continue resume getting a partially-downloaded file.
--progress=TYPE select progress gauge type.
-N, --timestamping don't re-retrieve files unless newer than local.
-S, --server-response print server response.
--spider don't download anything.
-T, --timeout=SECONDS set all timeout values to SECONDS.
--dns-timeout=SECS set the DNS lookup timeout to SECS.
--connect-timeout=SECS set the connect timeout to SECS.
--read-timeout=SECS set the read timeout to SECS.
-w, --wait=SECONDS wait SECONDS between retrievals.
--waitretry=SECONDS wait 1..SECONDS between retries of a retrieval.
--random-wait wait from 0...2*WAIT secs between retrievals.
-Y, --proxy explicitly turn on proxy.
--no-proxy explicitly turn off proxy.
-Q, --quota=NUMBER set retrieval quota to NUMBER.
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
--limit-rate=RATE limit download rate to RATE.
--no-dns-cache disable caching DNS lookups.
--restrict-file-names=OS restrict chars in file names to ones OS allows.
--user=USER set both ftp and http user to USER.
--password=PASS set both ftp and http password to PASS.

Directories:
-nd, --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
--protocol-directories use protocol name in directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.

HTTP options:
--http-user=USER set http user to USER.
--http-password=PASS set http password to PASS.
--no-cache disallow server-cached data.
-E, --html-extension save HTML documents with `.html' extension.
--ignore-length ignore `Content-Length' header field.
--header=STRING insert STRING among the headers.
--proxy-user=USER set USER as proxy username.
--proxy-password=PASS set PASS as proxy password.
--referer=URL include `Referer: URL' header in HTTP request.
--save-headers save the HTTP headers to file.
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (persistent connections).
--no-cookies don't use cookies.
--load-cookies=FILE load cookies from FILE before session.
--save-cookies=FILE save cookies to FILE after session.
--keep-session-cookies load and save session (non-permanent) cookies.
--post-data=STRING use the POST method; send STRING as the data.
--post-file=FILE use the POST method; send contents of FILE.

HTTPS (SSL/TLS) options:
--secure-protocol=PR choose secure protocol, one of auto, SSLv2, SSLv3, and TLSv1.
--no-check-certificate don't validate the server's certificate.
--certificate=FILE client certificate file.
--certificate-type=TYPE client certificate type, PEM or DER.
--private-key=FILE private key file.
--private-key-type=TYPE private key type, PEM or DER.
--ca-certificate=FILE file with the bundle of CA's.
--ca-directory=DIR directory where hash list of CA's is stored.
--random-file=FILE file with random data for seeding the SSL PRNG.
--egd-file=FILE file naming the EGD socket with random data.

FTP options:
--ftp-user=USER set ftp user to USER.
--ftp-password=PASS set ftp password to PASS.
--no-remove-listing don't remove `.listing' files.
--no-glob turn off FTP file name globbing.
--no-passive-ftp disable the "passive" transfer mode.
--retr-symlinks when recursing, get linked-to files (not dir).
--preserve-permissions preserve remote file permissions.

Recursive download:
-r, --recursive specify recursive download.
-l, --level=NUMBER maximum recursion depth (inf or 0 for infinite).
--delete-after delete files locally after downloading them.
-k, --convert-links make links in downloaded HTML point to local files.
-K, --backup-converted before converting file X, back up as X.orig.
-m, --mirror shortcut for -N -r -l inf --no-remove-listing.
-p, --page-requisites get all images, etc. needed to display HTML page.
--strict-comments turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions.
-R, --reject=LIST comma-separated list of rejected extensions.
-D, --domains=LIST comma-separated list of accepted domains.
--exclude-domains=LIST comma-separated list of rejected domains.
--follow-ftp follow FTP links from HTML documents.
--follow-tags=LIST comma-separated list of followed HTML tags.
--ignore-tags=LIST comma-separated list of ignored HTML tags.
-H, --span-hosts go to foreign hosts when recursive.
-L, --relative follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent don't ascend to the parent directory.

Mail bug reports and suggestions to .

Which option should be used, and what happens if the target file already exists? With no options, wget keeps the existing file and saves the newly downloaded copy under a ".1" suffix: if the target file bjfs0010.00d.Z exists, the new download is written to bjfs0010.00d.Z.1. So running wget with no options is a poor choice here. Problems also arise with the "-c" option, which resumes a download: although the uncompressed file (bjfs0010.00d) is identical across data centers, the size of the compressed file (bjfs0010.00d.Z) often differs, presumably because different compression programs (say, gzip vs. compress) were used, so a "resumed" file can end up corrupt. To sum up, the "-nc" option is the better choice: if the file is already there, wget neither resumes nor re-downloads it. The drawback is that partially downloaded files are left unfinished.
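Following that conclusion, a typical invocation looks like this (the URL is the first Table 5 example; shown only as a fragment, since it needs network access to the archive):

```shell
# -nc: if gode0010.05d.Z already exists locally, skip it entirely
# (no resume, no duplicate ".1" copies)
wget -nc ftp://cddis.gsfc.nasa.gov/pub/gps/data/daily/2005/001/05d/gode0010.05d.Z
```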

Script to convert .??o.Z file to .??d.Z


#!/bin/sh
# sh_o2d
# convert all RINEX (.??o) files to compressed RINEX format (*.??d)

if [ $# -gt 0 ]; then
cd $1
fi
curdir=`pwd`

echo "*.??o -> *.??d"
echo " working in $curdir"

files=`find ${curdir} -name "????????.??o*"`
for file in $files
do
echo " converting.. $file"
cd `dirname ${file}`
#pwd
sh_rnx2crx -c y -d y -f ${file}
done

Script to convert SIO NEU time series into CATS format.

#!/bin/csh

# Name:
# sh_sio2cat
#

# Purpose:
#

# Example:
#

# Modifications:
#

# Algorithm:
#

# Dependency:
#


set data_dir = /home/tianyf/data/cmonoc/timeseries/xyz_deoff
#set data_dir = /home/tianyf/cats_v3.1.2/cmonoc/cmonoc_xyz_catsfmt

set files = `find $data_dir -name "*.xyz"`
#set files = `find $data_dir -name "*.xyz.cat"`

#white noise only model
foreach file ( $files )
set ofile = "/home/tianyf/cats_v3.1.2/cmonoc/cmonoc_xyz-deoff_catsfmt/`basename $file`.cat"

echo "cat $file | awk '{print $1,$4,$5,$6,$7,$8,$9}' > $ofile "
cat $file | awk '{print $1,$4,$5,$6,$7,$8,$9}' > $ofile
end


Script to create sites map from GMT.

#!/bin/sh
#sh_sitemap
# Create site map from sites.defaults files using GMT
#
# Usage: sh_sitemap [ -expt EXPT ] [ -dir ${expt} ] [ -out ofile ]
# [ -check_dc dc.sites.file ]
# default to plot all sites in sites.defaults
#
# Required Files:
# ${expt}/tables/process.defaults
# ${expt}/tables/sites.defaults
# ${expt}/tables/itrfNN*.apr (defined process.defaults)
#
# Input coordinates are from lfile. or *.apr file.
# lfile. format:
#Epoch 2000.0956: From file itrf00_nafd.apr
#GRAS GRAS GPS N43 33 45.22598 E 6 55 14.05892 6369273.3773 Ref. Epoch 1997.0000 GRAS_GPS
#TOUL TOUL GPS N43 22 7.06826 E 1 28 50.73220 6368233.4576 Ref. Epoch 1997.0000 TOUL_GPS
#7604 7604 GPS N48 13 0.23097 W 4 30 13.78000 6366324.4534 Ref. Epoch 1997.0000 7604_GPS
# apr file format:
#* SYSTEM 1 Coordinates in SYSTEM 2 Frame
#* Name X (m) Y (m) Z (m) Xdot Ydot Zdot Epoch
# GRAS_GPS 4581691.0120 556114.6800 4389360.6960 0.00261 0.02073 -0.00653 1997.0000
# TOUL_GPS 4627846.1280 119629.1780 4372999.7230 0.00232 0.02054 -0.00789 1997.0000
# 7604_GPS 4228877.0780 -333104.1790 4747181.0000 0.00384 0.02027 -0.00663 1997.0000

#Algorithm
# +Get site names from ${expt}/tables/sites.defaults
# +Get site coordinates from ${expt}/tables/${itrf}.apr
#  (the ${itrf}.apr filename is obtained from ${expt}/tables/process.defaults)
# +Plot the site coordinates with site names as labels.


#Modification
# + SEP-14-2007 Check site data availability against the data center site
#   list file (e.g., sites.kasi)

# initializing parameters
# default to plot all sites coordinates
expt=
# default expt root = current directory
expt_root=`pwd`
# output plot in current directory (hard-wired)
expt_out=`pwd`
# default output name
ofile=sitmap.ps
# default data center site file (to be KASI)
check_dc=site.kasi

while [ "$1" != "" ]; do
case $1 in
-dir)
expt_root=$2
;;
-expt)
expt=$2
;;
-out)
ofile=$2
;;
-check_dc)
check_dc=$2
;;
*)
echo "Invalid option: $1"
exit
;;
esac
shift 2
done

if [ ! -d ${expt_root}/tables ]; then
echo "No ${expt_root}/tables directory exists! Exiting..."
exit
fi


# get sites
file=${expt_root}/tables/sites.defaults
if [ ! -f ${file} ]; then
echo "${file} does not exist! Exiting..."
exit
fi
sites=`grep "^ " ${file} | grep ${expt} | awk '{print substr($1,1,4)}'`
echo "Using Sites: ${sites}"


# site coordinates files
# use gapr_to_l to convert ${expt}/tables/itrfNN*.apr to old-style lfile.
#


# get apr file
file=${expt_root}/tables/process.defaults
# There may be several aprf lines in process.defaults; strip comments
# and use the last one.
coord_file=`sed 's/#.*//' ${file} | grep aprf | tail -1 | awk '{print $4}'`

#> awk '/aprf/' process.defaults
# set aprf = itrf00_nafd.apr
# set aprf = itrf05.apr
#> sed -n '/line/p' process.defaults
#> sed -n '/aprf/p' process.defaults
# set aprf = itrf00_nafd.apr
# set aprf = itrf05.apr
#> sed -n '/aprf/p' process.defaults | sed -n '$p'
# set aprf = itrf05.apr

coord_file=${expt_root}/tables/${coord_file}
echo ${coord_file}
#exit
# convert .apr to lfile.
#cd ${expt_root}/tables/
#pwd
gapr_to_l ${coord_file} .lf "" 2000.0
coord_file=.lf
#
i=0
tmp_llr=.llr
tmp_lbl=.lbl #labels (site names)
for site in ${sites}; do
echo "Searching coordinates for $site .."
coords=`grep -i ${site}_gps ${coord_file} | awk '{print $1,substr($0,19,14),substr($0,35,15),substr($0,18,1),substr($0,34,1)}' | awk '{ if($9 == "W") {printf("%f", -($5+$6/60+$7/3600))} else {printf("%f", ($5+$6/60+$7/3600))} ;print " "; if($8=="S"){printf("%f",-($2+$3/60+$4/3600))}else{printf("%f",$2+$3/60+$4/3600)} }'`
#echo $coords
#coords_lbl=`grep -i ${site}_gps ${coord_file} | awk '{print $1,substr($0,19,14),substr($0,35,15)}' | awk '{print ($5+$6/60+$7/3600) ,($2+$3/60+$4/3600), "12 0 1 CM ", $1}'` #ERROR
coords_lbl=`grep -i ${site}_gps ${coord_file} | awk '{print $1,substr($0,19,14),substr($0,35,15),substr($0,18,1),substr($0,34,1)}' | awk '{ if($9 == "W") {printf("%f", -($5+$6/60+$7/3600))} else {printf("%f", ($5+$6/60+$7/3600))} ;print " "; if($8=="S"){printf("%f",-($2+$3/60+$4/3600))}else{printf("%f",$2+$3/60+$4/3600)};print " 12 0 1 CM",$1 }'`

if [ "${coords}" != "" ]; then
if [ $i -eq 0 ]; then
echo ${coords} > ${tmp_llr}
echo ${coords_lbl} > ${tmp_lbl}
i=1
else
echo ${coords} >> ${tmp_llr}
echo ${coords_lbl} >> ${tmp_lbl}
fi
else
echo " -Not found."
fi

done
echo "___"
cat ${tmp_llr}
#exit

# plot sites names
#ranges=max/min
rng=`minmax -C ${tmp_llr}`
xmin=`echo ${rng} | awk '{print $1}'`
xmax=`echo ${rng} | awk '{print $2}'`
ymin=`echo ${rng} | awk '{print $3}'`
ymax=`echo ${rng} | awk '{print $4}'`
ix=`echo $xmin $xmax | awk '{print int(int(($2-$1)/10)/5)*5}'`
iy=`echo $ymin $ymax | awk '{print int(int(($2-$1)/10)/5)*5}'`
echo $ix ":" $iy

R=`minmax -I1 ${tmp_llr}`
echo $R

#pscoast $R -I1 -Na -Ggray -JM6i -Wthin -B.5g.5f.5/.5g.25f.25WSen -V -W > ${ofile}
pscoast $R -I1 -Na -Ggray -JM10i -Wthin -B${ix}/${iy}:."SITE MAP":WSEN -V -W -K > ${ofile}

psxy ${tmp_llr} -R -J -O -K -Sc.1i -Gyellow >> ${ofile}

echo "___"
cat ${tmp_lbl}
pstext -R -J -O -N -Gred ${tmp_lbl} >> ${ofile}


\rm -f ${tmp_llr} ${tmp_lbl} .lf fort.*

Script to convert .z to .Z

#!/bin/sh

#Convert the trailing .z to .Z.
#On Solaris 10 x86, when a FAT32 volume is mounted with -o foldcase, all
#uppercase letters appear lowercase; without this option, all names are
#shown in UPPERCASE.
#A Solaris bug??

#

#files=`find "/data0/igs0/pub/products" -name "*.z"`
year=1992
while [ $year -le 2000 ]; do
files=`find "/data0/igs0/pub/rinex/${year}" -name "*.z"`
for file in $files; do
ofile=`echo $file | awk '{print substr($0,1,index($0,".z"))"Z"}'`
#echo $ofile
#echo $file
echo "mv $file $ofile"
mv $file $ofile
done
year=`expr $year + 1`
done

Script to download RINEX OBS directory from KASI.

#!/bin/sh

local_root=~/igs/pub/rinex
year=1997
while [ ${year} -le 2006 ]; do
doy=001
ndays=`ymd2ydoy ${year} 12 31 | awk '{print $2}'`
yy=`echo $year | awk '{print substr($0,3,2)}'`
while [ ${doy} -le ${ndays} ]; do
doy=`echo $doy | awk '{printf("%03d",$0)'}`
day_dir=${local_root}/${year}/${doy}
mkdir -p ${day_dir}
cd ${day_dir}
pwd
ncftpget -R ftp://nfs.kasi.re.kr/gps/data/daily/${year}/${doy}/${yy}d
doy=`expr $doy + 1`
done
year=`expr $year + 1`
done


Note: the total volume of data downloaded this way is very large.

A script to download IGS RINEX observations files from DC.

#!/bin/sh
# NAME
# sh_wget_rnx

local_root=~/igs/pub/rinex
local_root=/i/data.server/pub/rinex
local_root=/cygdrive/i/data.server/pub/rinex
local_root=/cygdrive/l/igs3/pub/rinex
local_root=/cygdrive/k/igs_data/pub/hfiles
#local_root=/cygdrive/i/igs_data/pub/hfiles
local_root=/cygdrive/j/igs2/pub/rinex
yearStart=2005
yearStart=1999
yearStart=2001
yearEnd=2002

doys=001
doye=`ymd2ydoy ${yearStart} 12 31 | awk '{print $2}'`

sites=lhaz,guao,shao,kunm,irkt,irkm,naga,tnml,taiw,aira,daej,khaj,suwn,ulab,yaka,kit3

# process command-line parameters
#####################################################
#COMMAND LINE PARAMETERS
#####################################################
while [ "$1" != "" ]
do
#echo $1
case $1 in
-dir)
cd $2
local_root=$2
;;
-yrs)
yearStart=$2
yr=`echo $yearStart | awk '{print substr($0,3,2)}'`
y=`echo $yearStart | awk '{print substr($0,4,1)}'`
ndays=`ymd2ydoy ${yearStart} 12 31 | awk '{print $2}'`
;;
-yre)
yearEnd=$2
;;
-doys)
doys=$2
;;
-doye)
doye=$2
;;
-sites)
sites=$2
;;

*)
echo "invalid options: $1"
exit 1
;;
esac
shift 2
done
#echo $sites
sites=`echo $sites | awk -F, '{for (i=1;i<=NF;i++) print $i}'`
#echo $sites

#exit

year=${yearStart}
while [ ${year} -le ${yearEnd} ]; do

if [ ${year} -eq ${yearStart} ]; then
doy=${doys}
else
doy=001
fi

if [ ${year} -eq ${yearEnd} ]; then
ndays=${doye}
else
ndays=`ymd2ydoy ${year} 12 31 | awk '{print $2}'`
fi

yy=`echo $year | awk '{print substr($0,3,2)}'`
while [ ${doy} -le ${ndays} ]; do
doy=`echo $doy | awk '{printf("%03d",$0)'}`
day_dir=${local_root}/${year}/${doy}
#mkdir -p ${day_dir}
cd ${day_dir}
pwd

echo "wget -nc --http-user=anonymous --http-password=tianyf@gmail.com http://garner.ucsd.edu/pub/hfiles/${year}/${doy}/higs2a.${yy}${doy}.Z"
for site in $sites; do
wget -nc --http-user=anonymous --http-password=tianyf@gmail.com http://garner.ucsd.edu/pub/rinex/${year}/${doy}/${site}${doy}0.${yy}d.Z
done
doy=`expr $doy + 1`
done
year=`expr $year + 1`
done


HTTP connections to GARNER seem much faster than FTP connections, so I chose HTTP to download the data. However, if you use ftp/ncftp to download files, the FTP server may be more convenient.

Saturday, March 29, 2008

How does the SIO hfile come out?

just do a rerun?

I want to generate my own version of "SIO" H-files.


P.S.
I think I may get the answer:
ftp://garner.ucsd.edu/pub/combinations/globl/${year}/${doy}

There are autcln.cmd, station.info, sestbl., and sittbl. files in those folders. These are enough for a GAMIT run; only two more files (process.defaults and sites.defaults) are needed. ;)

Thursday, March 27, 2008

How to use absolute antenna correction in GAMIT?

Is the absolute antenna correction the default in GAMIT processing, or does it still use the relative correction by default?


P.S.
Actually, the current default setting is to use absolute phase center correction.

antmod.dat -> ../../tables/igs05_1400.atx


I got the clue from svnav.dat file:
Note: The SV antenna offsets in this file are the old nominal values,
superseded by the values used in the models distributed as ANTEX files by the IGS.
If antmod.dat points to an absolute PCV ANTEX files, the antmod.dat values
will override the svnav.dat values (see the p-file for confirmation).
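The antmod.dat link quoted above can be recreated with a symlink like this (paths are the ones from the post; a sketch, not the official GAMIT setup procedure):

```shell
#!/bin/sh
# Point antmod.dat at the IGS absolute-PCV ANTEX file
# (relative path as quoted in the post).
ln -sf ../../tables/igs05_1400.atx antmod.dat
ls -l antmod.dat
```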

IGS Workshop 2006

http://nng.esoc.esa.de/ws2006/

Friday, March 21, 2008

Google Earth in iTouch?

Will Google Earth be ported to the iPod Touch?