
Tree of Life

Extracts data from Colleague, transforms it into files that meet Tree of Life specifications, and transfers them to Tree of Life via SFTP.

Details

This export runs on the DataExports server.

Additional notes

Contacts

IT:  Konrad Willmert

Vendor:  Nathan Mazellan

National and Global Customer:  Dennis Martin, Sue Melton

IWU Marion Customer:  Cindy Ketcham

Scheduling

Run Time(s): Hourly every day at 1 minute past the hour.

Final Target (an example upload sketch follows the list below):

  • transfer.treeoflifebooks.com
  • SFTP (SSH) 
  • ssh-rsa 2048 3f:78:6d:3d:07:c5:59:6c:b4:35:9f:e4:7e:60:10:e9
    • RES files
      • Path: /home/indwes/data_files
      • iwu_course.csv
      • iwu_customer.csv
      • iwu_enrollment.csv
      • iwu_section.csv
    • NONRES files
      • Path: /home/indwes-nr/data_files
      • iwu_nr_course.csv
      • iwu_nr_customer.csv
      • iwu_nr_enrollment.csv
      • iwu_nr_section.csv
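
The export tool handles this upload itself; the block below is only a minimal sketch of an equivalent upload using the SSH.NET library (Renci.SshNet), showing how the published host key fingerprint could be verified before any files are sent. The host name, fingerprint, remote path, and file names come from the list above; the "indwes" user name, the password placeholder, and the choice of library are assumptions.

```csharp
// Minimal SFTP upload sketch using the SSH.NET NuGet package (Renci.SshNet).
// Host, fingerprint, remote path, and file names come from the target list above;
// the "indwes" user name and the password placeholder are assumptions.
using System;
using System.IO;
using Renci.SshNet;

class TolUploadSketch
{
    static void Main()
    {
        // Published MD5 fingerprint of the server's ssh-rsa host key.
        const string expectedFingerprint = "3f:78:6d:3d:07:c5:59:6c:b4:35:9f:e4:7e:60:10:e9";

        using var client = new SftpClient("transfer.treeoflifebooks.com", "indwes", "********");

        // Refuse the connection if the host key fingerprint does not match the published value.
        client.HostKeyReceived += (_, e) =>
        {
            var actual = BitConverter.ToString(e.FingerPrint).Replace("-", ":").ToLowerInvariant();
            e.CanTrust = actual == expectedFingerprint;
        };

        client.Connect();
        client.ChangeDirectory("/home/indwes/data_files");   // RES file set

        foreach (var name in new[] { "iwu_course.csv", "iwu_customer.csv",
                                     "iwu_enrollment.csv", "iwu_section.csv" })
        {
            using var stream = File.OpenRead(name);
            client.UploadFile(stream, name, canOverride: true);
        }

        client.Disconnect();
    }
}
```

For the NONRES file set, the same pattern would apply using the /home/indwes-nr/data_files path and the iwu_nr_*.csv file names.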

Query method

All queries use SQL scripts.

  1. Pre-built T-SQL filter snippets are gathered from the Colleague table X.PARAMETERS. The records all start with TOL.SQL and are rebuilt when customers save out of XTNP.
  2. Seven separate SQL scripts located in the TOL subdirectory are retrieved, the filter snippets are appended, and the scripts are executed sequentially to load data into staging tables in WarehouseDB.exports (see the sketch after this list).
  3. The staging tables are then copied to compressed history tables in the same database, stamped with a timestamp matching the archive zip and the log files (a sketch of this copy follows the data table lists below).
  4. The main SQL query (in TOL.cs) is assembled based on the res or nonres argument and executed.
  5. The results are transformed in memory to match the TOL layout specifications.
  6. The final data is written to CSV files.
  7. The files and process statistics are checked for problems.
  8. Finally, the files are uploaded.
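
The sketch below illustrates steps 1 and 2 (and the 8-minute timeout described under the checks section): filter snippets keyed TOL.SQL* are read from X.PARAMETERS, appended to the scripts in the TOL subdirectory, and run into WarehouseDB.exports. The connection strings, the X.PARAMETERS column names, and the snippet-to-script pairing are illustrative assumptions, not the actual TOL.cs code.

```csharp
// Sketch of the staging build: pull the TOL.SQL* filter snippets from the
// replicated Colleague X.PARAMETERS table, append each snippet to its script
// from the TOL subdirectory, and run the result into WarehouseDB.exports.
// Column names, connection strings, and the key/file pairing are hypothetical.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;

class StagingBuildSketch
{
    static void Main()
    {
        var snippets = new Dictionary<string, string>();

        using (var colleague = new SqlConnection("Server=colleague-sql;Database=Colleague;Integrated Security=true"))
        {
            colleague.Open();
            // Hypothetical key/value column names for the X.PARAMETERS table.
            var cmd = new SqlCommand(
                "SELECT X_PARAMETERS_ID, X_PARM_VALUE FROM X_PARAMETERS WHERE X_PARAMETERS_ID LIKE 'TOL.SQL%'",
                colleague);
            using var reader = cmd.ExecuteReader();
            while (reader.Read())
                snippets[reader.GetString(0)] = reader.GetString(1);
        }

        using var warehouse = new SqlConnection("Server=warehouse-sql;Database=WarehouseDB;Integrated Security=true");
        warehouse.Open();

        // Seven scripts, one per staging table, each paired with its filter snippet.
        foreach (var script in Directory.GetFiles("TOL", "*.sql"))
        {
            var sql = File.ReadAllText(script);
            var key = "TOL.SQL." + Path.GetFileNameWithoutExtension(script).ToUpperInvariant();
            if (snippets.TryGetValue(key, out var filter))
                sql += Environment.NewLine + filter;          // append the customer-maintained filter

            var load = new SqlCommand(sql, warehouse) { CommandTimeout = 480 };   // 8-minute cap
            load.ExecuteNonQuery();                            // fills one *_STAGE table in the exports schema
        }
    }
}
```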

Data tables

All data tables are located in WarehouseDB.exports.

Staging data prior to filters

  • TOL_ACTIVE_PROGRAMS_STAGE
  • TOL_ADDRESS_INFO_STAGE
  • TOL_COURSE_SECTIONS_STAGE
  • TOL_FACULTY_STAGE
  • TOL_STUDENT_ROOMS_STAGE
  • TOL_STUDENT_SECTIONS_STAGE
  • TOL_STUDENTS_STAGE

History data tables per build of staging data

  • TOL_ACTIVE_PROGRAMS_HIST
  • TOL_ADDRESS_INFO_HIST
  • TOL_COURSE_SECTIONS_HIST
  • TOL_FACULTY_HIST
  • TOL_STUDENT_ROOMS_HIST
  • TOL_STUDENT_SECTIONS_HIST
  • TOL_STUDENTS_HIST
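
A minimal sketch of the staging-to-history copy from step 3 of the query method, using the staging and history table names listed above. The SnapshotTime column, the connection string, and the INSERT ... SELECT approach are assumptions about how the copy could look, not the actual implementation.

```csharp
// Sketch: copy each *_STAGE table into its matching *_HIST table, stamped with
// the run timestamp that also names the archive zip and log files. The
// SnapshotTime column and connection string are illustrative assumptions.
using System;
using System.Data.SqlClient;

class HistoryCopySketch
{
    static void Main()
    {
        string[] tables =
        {
            "TOL_ACTIVE_PROGRAMS", "TOL_ADDRESS_INFO", "TOL_COURSE_SECTIONS",
            "TOL_FACULTY", "TOL_STUDENT_ROOMS", "TOL_STUDENT_SECTIONS", "TOL_STUDENTS"
        };

        var runStamp = DateTime.Now;   // matches the archive zip and log file timestamp

        using var conn = new SqlConnection("Server=warehouse-sql;Database=WarehouseDB;Integrated Security=true");
        conn.Open();

        foreach (var table in tables)
        {
            var copy = new SqlCommand(
                $"INSERT INTO exports.{table}_HIST SELECT @stamp AS SnapshotTime, * FROM exports.{table}_STAGE",
                conn);
            copy.Parameters.AddWithValue("@stamp", runStamp);
            copy.ExecuteNonQuery();
        }
    }
}
```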

Logging and checks

The following information is retained to help troubleshoot in case problems are encountered.

  • Export script text log files are kept indefinitely.
  • Archived (zipped) file sets which were uploaded to TOL are kept for 30 days.
  • ExportManager logs contain detailed results of each execution including all SQL statements generated, the number of results, and any warnings or errors.
  • ExportManager keeps run history metadata in WarehouseDB.exports.ExportManagerRunHistory indefinitely.
  • History tables in WarehouseDB.exports, holding a snapshot of each staging data rebuild, are kept indefinitely.

The following checks are automatically performed, in this order, with each execution. If a check fails, an email with the relevant warnings or errors is automatically sent to the developers; the same messages also appear in the text log files.

Staging Build Step

  1. If staging tables fail to rebuild, the entire process halts.
  2. If any staging table query executes for longer than 8 minutes, it times out and the entire process halts (see the sketch below).
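
A minimal sketch of how these two checks could be enforced: the 8-minute limit maps to a command timeout, and any SqlException (including a timeout) is emailed to the developers before the run halts. The SMTP host, addresses, and exit handling are placeholders, not the actual ExportManager code.

```csharp
// Sketch of the staging-build failure handling: an 8-minute command timeout or
// any other SqlException triggers a developer email and halts the whole run.
// The SMTP host and addresses are placeholders.
using System;
using System.Data.SqlClient;
using System.Net.Mail;

class StagingCheckSketch
{
    static void RunStagingQuery(SqlConnection conn, string sql)
    {
        try
        {
            var cmd = new SqlCommand(sql, conn) { CommandTimeout = 480 };   // 8 minutes
            cmd.ExecuteNonQuery();
        }
        catch (SqlException ex)
        {
            // Alert the developers and halt the entire process.
            using var smtp = new SmtpClient("smtp.example.edu");
            smtp.Send(new MailMessage("exports@example.edu", "developers@example.edu",
                "TOL export: staging rebuild failed", ex.Message));
            Environment.Exit(1);
        }
    }
}
```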

Main Query Step

  1. If the main query fails, the entire process halts.
  2. If the main query executes for longer than 10 minutes, it times out and the entire process halts.
  3. If the main query returns no results, a warning is logged, but the process continues (it later halts due to the checks below); see the sketch after this list.
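
The main-query step can follow the same pattern with a 10-minute command timeout; the difference sketched below is that an empty result set only logs a warning and lets the downstream checks stop the run. The logger delegate and row handling are illustrative placeholders.

```csharp
// Sketch of the main-query checks: a failure or 10-minute timeout throws and
// halts upstream, while zero rows only produces a warning at this stage.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class MainQueryCheckSketch
{
    static List<object[]> RunMainQuery(SqlConnection conn, string sql, Action<string> logWarning)
    {
        var rows = new List<object[]>();
        var cmd = new SqlCommand(sql, conn) { CommandTimeout = 600 };   // 10 minutes

        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                var row = new object[reader.FieldCount];
                reader.GetValues(row);
                rows.Add(row);
            }
        }

        if (rows.Count == 0)
            logWarning("Main query returned no results; continuing to the downstream checks.");

        return rows;
    }
}
```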

Transformation Step

  1. If any part of the data transformation step fails, the entire process halts.

File Writing Step

  1. If the main query returned no results, an error is logged and the entire process halts.
  2. If the main query returned fewer or more result sets than expected, a warning is logged, but the process continues (it later halts due to checks below).
  3. If any file could not be written to disk, an error is logged, but the process continues (it later halts due to checks below).
  4. If fewer than four files were written to disk, the entire process halts (see the sketch below).
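
A sketch of the file-writing checks, assuming each result set is written as a simple comma-joined CSV; the real column layout follows the TOL specifications, and the logError delegate stands in for the ExportManager logging. The fewer-than-four-files halt mirrors check 4 above.

```csharp
// Sketch of the file-writing checks: write each result set as a CSV, keep going
// past individual write failures, then halt if fewer than four files exist.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class FileWriteCheckSketch
{
    static void WriteAndCheck(Dictionary<string, List<string[]>> resultSets, Action<string> logError)
    {
        foreach (var set in resultSets)                  // e.g. "iwu_course.csv" -> its rows
        {
            try
            {
                File.WriteAllLines(set.Key, set.Value.Select(r => string.Join(",", r)));
            }
            catch (IOException ex)
            {
                logError($"Could not write {set.Key}: {ex.Message}");   // continue; counted below
            }
        }

        var written = resultSets.Keys.Count(File.Exists);
        if (written < 4)
        {
            logError($"Only {written} of 4 expected files were written; halting.");
            Environment.Exit(1);
        }
    }
}
```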

Validation Step

  1. If the total size of the files on disk varies from the previous run by more than 80%, the entire process halts.
    • If the previous run failed, preventing comparison, this fact is logged, but the process continues.
  2. If the total size of the files on disk varies from the previous run by more than 20%, a warning is logged, but the process continues (see the sketch below).
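
A sketch of the size-variance rule, assuming the previous run's total is read back from a stored value. In practice the ExportManagerRunHistory table described under logging is the more likely source; a plain text file is used here only to keep the example self-contained.

```csharp
// Sketch of the total-size comparison: more than an 80% swing halts the run,
// more than 20% logs a warning, and a missing prior total is noted but ignored.
// Persisting the total to a text file is an illustrative simplification.
using System;
using System.IO;
using System.Linq;

class SizeVarianceSketch
{
    static void CheckTotalSize(string[] csvFiles, string previousTotalFile,
                               Action<string> logWarning, Action<string> logError)
    {
        long currentTotal = csvFiles.Sum(f => new FileInfo(f).Length);

        if (!File.Exists(previousTotalFile))
        {
            logWarning("No previous run total available for comparison; continuing.");
        }
        else
        {
            long previousTotal = long.Parse(File.ReadAllText(previousTotalFile));
            double variance = Math.Abs(currentTotal - previousTotal) / (double)previousTotal;

            if (variance > 0.80)
            {
                logError($"Total file size changed by {variance:P0} since the last run; halting.");
                Environment.Exit(1);
            }
            if (variance > 0.20)
                logWarning($"Total file size changed by {variance:P0} since the last run.");
        }

        File.WriteAllText(previousTotalFile, currentTotal.ToString());   // saved for the next run
    }
}
```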

Upload Step

  1. If the number of files uploaded is less than the number of files expected, an error is logged.
  2. **CHECK DISABLED: If the remote server's file hashes differ from the local file hashes (indicating corruption during transfer), an error is logged and the files are removed from the server.

** This check is not supported by Tree of Life's SFTP server. 
