1.- Pending schema change requests:
===================================

#60 (P. Lombard): Units table: "integral of velocity squared" is not a meaningful description.
-----------------------------------------------------------------
Several different length and time combinations could be used for
"integral of velocity squared". Is the velocity in cm/sec or m/sec?
Is the integral with respect to time or distance?
-----------------------------------------------------------------
------------------------------------------------------------------------
==> Discussed. Caltech tasked to find out what the exact units are.
    Based on Hiroo's reply, the unit of IOVS should be 'cm^2 * sec';
    it will be stored as 'cmcms' in the database.
------------------------------------------------------------------------

2.- New schema change requests:
===============================

#10 (D. Given): Add new column "nread" to NETMAG table.
-----------------------------------------------------------------
Specify the number of readings used for magnitudes (this may differ
from NETMAG.NSTA).
-----------------------------------------------------------------
------------------------------------------------------------------------
==> Discussed. Doug G. to come up with a strawman and send it to the
    Standards group:

    The Schema Change Working Group needs your input on a small
    question with large implications. Every magnitude in the dbase has
    an attribute called "NSTA". Most programs write the number of
    *observations* (amps, codas) contributing to the magnitude in this
    field. One program (TriMag) writes the number of *stations*. So
    the field name doesn't match its most common usage, which will be
    confusing to future developers.

    The WG considered just changing TriMag to write the number of
    observations and leaving the misleading column name (NSTA) as it
    is. This may cause difficulties as the software propagates out to
    other networks.
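As a rough illustration of what the proposed fix might involve at the database level, here is a hypothetical migration sketch in Oracle-style SQL. It is not approved DDL; the column type and the `MAGID`/`SUBSOURCE` identifiers and the use of ASSOCAMM are assumptions for illustration only.

```sql
-- Hypothetical sketch (not approved DDL): add the proposed NOBS
-- column to NETMAG; type and precision are assumptions.
ALTER TABLE NETMAG ADD (NOBS NUMBER(8));

-- Backfill: for rows written by programs that already stored the
-- observation count in NSTA, the old value could simply be copied.
UPDATE NETMAG SET NOBS = NSTA
 WHERE NOBS IS NULL;

-- For TriMag-written rows, NSTA already holds the station count, so
-- NOBS would instead be recomputed, e.g. from associated amplitude
-- readings (table and column names here are illustrative only):
-- UPDATE NETMAG n
--    SET NOBS = (SELECT COUNT(*)
--                  FROM ASSOCAMM a
--                 WHERE a.MAGID = n.MAGID)
--  WHERE n.SUBSOURCE = 'TriMag';
```

The real backfill would need to distinguish which program wrote each row before deciding whether NSTA holds observations or stations.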
The WG is now proposing to "fix" the situation by doing the following:

* add a new field called NOBS for the number of observations
* retroactively rewrite NSTA and populate NOBS in the dbase to reflect
  this change
* change all codes to use the new NOBS field and interpret NSTA as the
  number of stations

This change would require changes to several codes: for example,
TriMag, data import and load programs, Jiggle, STP, dbselect,
EWmag2CISN, the magPref package, alarm rules, and more.

QUESTION: Do you think it is worth the effort to change the code to
fix the 'misnamed' NSTA field and record both the number of
observations and the number of stations?
------------------------------------------------------------------------

#12 (D. Given): Add boolean column "corrected" to ASSOCAMM table.
-----------------------------------------------------------------
Specify whether ASSOCAMM.MAG has been corrected.
-----------------------------------------------------------------
------------------------------------------------------------------------
==> Refused. Clarify the documentation instead: the magnitude already
    includes the correction.
------------------------------------------------------------------------

#19 (D. Given): Add "timequality" to WAVEFORM.
-----------------------------------------------------------------
Need to store waveform time quality (currently one must retrieve the
SEED timeseries data and read the headers to get this information).
-----------------------------------------------------------------
------------------------------------------------------------------------
==> Discussed. Doug G. to go back to Egill for more information.
------------------------------------------------------------------------

#18 (D. Given): Add field "actual_rate" to table WAVEFORM.
-----------------------------------------------------------------
Specify the actual waveform sample rate (WAVEFORM.SAMPRATE is only the
expected rate).
-----------------------------------------------------------------
------------------------------------------------------------------------
==> Not discussed.
------------------------------------------------------------------------

3.- Possible use of SVN for tracking schema-related documentation:
==================================================================

* DDL scripts (SQL):
  a) Schema creation (PI, WF, HT, IR, SIS):
     - Should we just store the logical model of the schemas?
       The NC & SC physical models are different.
  b) Alternate tables (RT, post processing):
     - Does SC already store them in svn?
     - NC should do the same (ChannelMap, AssocWfRc, ...).
  c) Views:
     - NC & SC have different versions of some views
       (md_magparms_view, ...).
  d) Packages:
     - Does SC already store them in svn?
     - Berkeley should do the same (Truetime, ...).
* ER/Studio files (HTML).
------------------------------------------------------------------------

2008/03/21
We decided in a first pass to store only the logical views of the DDL
scripts. Ellen volunteered to come up with a draft for the SVN
structure.

2008/04/25
Ellen came up with a strawman for the repository structure.
-----------------------------------------------------------------
I just spoke with Rae Yip about using SVN to store the database
creation scripts. Rae tells me that the scope of this SVN repository
is to store source code, not site-specific configurations. So it seems
that we would only check scripts concerning relational properties (and
not physical or table storage properties) into this repository.

Here is a strawman for these scripts. Current SVN repository structure
for database scripts:

DB
->trunk
->branches (there is a leap second dev branch - currently only
  storedprocedures is under this branch)

Under "->trunk":
->storedprocedures
->tables
->types
->views
->snapshots

Other scripts that are site-specific:
1. sequences
2. users
3. grants
4. table site-specific storage parameters
-----------------------------------------------------------------

2008/05/30
Ellen started copying over core schema tables into SVN.
-----------------------------------------------------------------
I've started checking table creation scripts into SVN - DB/trunk.
These scripts are based on the work to make the CISN build and have no
references to tablespace names. I'll try to keep the cisn-commit
emails to a minimum.

I'm making subdirectories under tables for each of the schema groups
we have - parametric, waveform, hardware tracking, application,
instrument response. Allan Walter had checked in a number of table
creation scripts in the past directly under "trunk"; I'll touch base
with him about moving those scripts to the correct subdirectory.
-----------------------------------------------------------------

2008/07/11
Ellen copied all of the core tables into SVN. However, some of the
tables are SC-specific and don't reflect the SCWG-approved requests.
The group agreed that the version of the tables in SVN should reflect
all of the approved changes, regardless of whether those changes have
been implemented in NC or SC.
------------------------------------------------------------------------
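To make the "relational properties only" rule above concrete, a checked-in, site-neutral table creation script might look like the following minimal sketch. The table and column names are purely illustrative, not an actual CISN table.

```sql
-- Illustrative only: a site-neutral creation script keeps the
-- relational properties (columns, datatypes, keys, constraints)...
CREATE TABLE EXAMPLE_TABLE
(
  ID      NUMBER(15)    NOT NULL,
  NAME    VARCHAR2(32)  NOT NULL,
  LDDATE  DATE          DEFAULT SYSDATE,
  CONSTRAINT EXAMPLE_TABLE_PK PRIMARY KEY (ID)
);
-- ...while site-specific physical properties (TABLESPACE and STORAGE
-- clauses, grants, sequences) are deliberately omitted; per the
-- discussion above, those stay out of the shared repository.
```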