The solution is to export the table data to files before making changes, and then, after restarting the application with the changes in place, reload the data from the previously created files.
The types of changes which prevent recovery are:
- Renaming the table
- Changing the Primary Key fields
- Changing the Snapshot Parallelism count
- Adding a column between other columns
- Removing a column
- Changing the data type of a column
- Changing the order of columns
Note that after export, in order to re-import the data, equivalent changes may need to be made in the CSV files so that the column order and data types match the modified table.
Basic Steps
1. Start with a fully populated LiveView table.
2. Run the LiveView application and confirm it has started (engine log reports: "*** All tables have been loaded. LiveView is ready to accept client connections.").
3. Obtain the LiveView Client URI
C:\>epadmin --servicename=A.lv display engine
[A.lv] LiveView Fragment Client URI = lv://mysystem:11080
4. Run the data export command.
lv-client -c -u lv://mysystem:11080 'SELECT * FROM orders ORDER BY rid ASC limit 1000000' > orders.csv
In this example, the "orders" table is exported to the "orders.csv" file. The "rid" column is the Primary Key which is unique for each row.
Repeat for each table. Do not export Aggregate tables which are based on other tables. Aggregate tables will be re-calculated as the new data is imported.
The limit of one million assumes the table holds fewer than 1M rows; increase this value so that it is equal to or greater than the current table row count. If you cannot export all the rows to a single file, use a smaller limit value and export in sections. Because the results are sorted with ORDER BY, you can add a WHERE expression to each subsequent export to select non-overlapping primary key ranges. Check the last Primary Key value in the previous export and use it to select the next set of rows, for example:
lv-client -c -u lv://127.0.0.1:11080 "SELECT * FROM orders WHERE rid>500000 ORDER BY rid ASC limit 500000" > orders2.csv
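The paging step above can be sketched as a small helper. This is an illustrative sketch, not an official lv-client feature: it assumes "rid" is the first CSV column, that the table name is "orders", and that the export's "# Rows = N" footer line starts with '#'.

```shell
#!/bin/sh
# Hypothetical helper: read the last primary-key value from a previous
# export and build the SELECT statement for the next non-overlapping chunk.
# Assumes "rid" is the first column; skips '#'-prefixed footer lines.
next_query() {
  last_rid=$(grep -v '^#' "$1" | tail -n 1 | cut -d, -f1)
  echo "SELECT * FROM orders WHERE rid>${last_rid} ORDER BY rid ASC limit 500000"
}
# Example usage after the first export completes:
# lv-client -c -u lv://mysystem:11080 "$(next_query orders.csv)" > orders2.csv
```

Repeat until an export returns fewer rows than the limit, which indicates the table has been fully exported.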
5. Make changes to the table and re-run the application.
6. Import the data from the CSV files into each table.
To import each CSV data file back into the table use command:
lv-client -u lv://{hostname:port} publish {tablename} < {tablename}.csv
Replace the bracketed values as necessary.
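When several tables were exported, the import can be looped over the CSV files. The following is a sketch under the assumption that each file is named {tablename}.csv and that lv-client is on the PATH; the URI is an example value.

```shell
#!/bin/sh
# Sketch: publish every <tablename>.csv in the current directory back into
# the table of the same name. LV_URI is an example; replace with your URI.
LV_URI="lv://mysystem:11080"

# Derive the table name from a CSV file name (strips path and .csv suffix).
table_for() { basename "$1" .csv; }

import_all() {
  for f in *.csv; do
    lv-client -u "$LV_URI" publish "$(table_for "$f")" < "$f"
  done
}
# Run with: import_all
```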
Considerations and Troubleshooting
Export is best done outside business hours or during scheduled down-time so that the table data is not changing while these commands are running. Import may be done while the application is live as long as the primary key values arriving in live data will not collide with the primary key values in the CSV files. If this cannot be guaranteed, stop new data loading until the CSV files are completely read into the table and the contents confirmed.
There are two expected warning messages that the import command reports which do not affect loading the data into the table:
Type "quit;" or Enter to exit
rid,orderid,ts,sku,item,EventArrivalTS
(1) line: 1 field: rid value: rid exception: Field rid: Invalid long value: "rid"
(2) line: 12 Expected 6 fields, got 1 line:# Rows = 10
The warnings marked above as (1) and (2) may be ignored. These represent non-data header and footer rows that should not be loaded.
Note that the first line, "Type "quit;" or Enter to exit", is reported because the command expects interactive input; it may be ignored. The second line reports the current table column names for reference.
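Although these warnings are harmless, you can avoid them by removing the non-data lines before publishing. A minimal sketch, assuming the column-name header is always line 1 and footer lines begin with '#':

```shell
#!/bin/sh
# Optional sketch: strip the header row and '# Rows = N' footer from an
# export file so the two expected warnings do not appear during import.
strip_nondata() { sed -e '1d' -e '/^#/d' "$1"; }
# Example usage:
# strip_nondata orders.csv | lv-client -u lv://mysystem:11080 publish orders
```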
Errors like the following indicate that the table was not loaded:
rid,orderid,ts,sku,item,EventArrivalTS,extra_column
line: 1 Expected 7 fields, got 6 line:rid,orderid,ts,sku,item,EventArrivalTS
You will need to edit the CSV file to make it compatible with the modified table by adding or removing a column as needed. The Linux command-line utilities 'sed', 'awk', and 'cut' are useful for bulk reformatting text files when the same change needs to be applied to every line. To supply an empty value for a field, use the value "null" (without quotes).