Rainfall interface file format #185
-
I'd like to write a rainfall interface file outside of SWMM to facilitate use of hundreds or thousands of rain gages. The rain.c documentation explains the format, but I end up with a file that SWMM cannot read. When I compare my file with one saved by SWMM, there appears to be a difference after the 14-byte file header and before the gage metadata section. Perhaps this stems from my misunderstanding of what is meant by "(MAXMSG+1 (=80) bytes)". Given that MAXMSG=1024, this would appear to be equivalent to (1025 (=80) bytes), which I cannot understand. Should the station ID be 80 bytes? Is the length of the gage metadata section 14 + 92 x number of gages? Should the starting byte for the first rain data record be 14 + 92g + 1? Is there example code that writes a rainfall interface file from a text rainfall file?
Replies: 3 comments 1 reply
-
From @LRossman: "The Station ID that you write to the file has to be a character array of length MAXMSG+1. Since MAXMSG is defined as 1024, the array has to be dimensioned to 1025. The '(=80)' comment was probably left over from an earlier version where MAXMSG was only 79. Also, you must write the entire 1025 bytes of the Station ID to the file, not just whatever the length of the actual Station ID is."
I can share a Python snippet for writing it if you are still encountering issues.
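As a quick illustration of that fixed-width field: Python's struct module null-pads the 's' format automatically, so the full 1025 bytes can be written in one call. A minimal sketch; the station name here is just a placeholder:

import struct

MAXMSG = 1024  # value of MAXMSG reported above for current SWMM source

# '{n}s' truncates or null-pads to exactly n bytes, so the whole
# 1025-byte Station ID field is always written, padding included.
station_field = struct.pack(f'{MAXMSG + 1}s', b'RG-001')
assert len(station_field) == MAXMSG + 1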
-
Another helpful bit from @LRossman: "You have to place a null character (CHAR(0)) after the last character in the Station ID, and maybe pad the rest of the 1025 characters with nulls as well to be safe." (I suspect the padding advice applies to Fortran, which pads character strings with spaces, while C leaves unused array elements as nulls.)
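Putting both replies together, the arithmetic from the original question works out as follows once 80 is replaced by MAXMSG+1 = 1025. A sketch (the gage count is hypothetical), consistent with the position bookkeeping in the snippet below:

num_gages = 3  # hypothetical

header_bytes = 10 + 4              # file stamp + station count = 14
per_gage_bytes = 1025 + 4 + 4 + 4  # Station ID + interval + start + end = 1037
first_data_byte = header_bytes + per_gage_bytes * num_gages

# File offsets are 0-based (as with C's ftell/fseek), so the first rain
# data record starts at this offset itself, with no extra +1.
print(first_data_byte)  # 14 + 1037 * 3 = 3125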
-
@MitchHeineman, I have provided the Python snippet below.

from datetime import datetime

import pandas as pd
import struct as st


def datetime2ole(date):
    """
    Convert a datetime object to an OLE automation date
    (days since 1899-12-30, with fractional days).
    :param date: datetime to convert
    :return: OLE date as a float
    """
    OLE_TIME_ZERO = datetime(1899, 12, 30)
    delta = date - OLE_TIME_ZERO
    return float(delta.days) + float(delta.seconds) / 86400  # 86,400 seconds in a day


def string_to_null_terminated_bytes(s: str, max_length: int):
    """
    Convert a string to a null-padded byte string of exactly max_length bytes.
    :param s: string to encode
    :param max_length: total number of bytes to produce
    :return: ASCII bytes padded with nulls
    """
    return bytes(s, 'ascii') + b'\x00' * (max_length - len(s))


with open('rain.rff', 'wb') as f:
    # Read the rainfall data from the text file
    data = pd.read_csv(
        filepath_or_buffer='rain.dat',
        header=0,
        sep=r'\s+',
        names=['Station', 'Year', 'Month', 'Day', 'Hour', 'Minute', 'Rainfall'],
        comment=';'
    )
    data.set_index(pd.to_datetime(data[['Year', 'Month', 'Day', 'Hour', 'Minute']]), inplace=True)

    # filter out zero rainfall
    data = data[data['Rainfall'] > 0]

    group = data.groupby('Station')
    num_stations = len(group)

    # 10-byte file stamp, then the number of stations as a 4-byte integer
    f.write(string_to_null_terminated_bytes('SWMM5-RAIN', 10))
    f.write(st.pack('i', num_stations))

    # Rain data begins after the 14-byte header plus one 1037-byte
    # metadata record (1025 + 4 + 4 + 4) per station.
    current_position = 10 + 4 + num_stations * (1025 + 4 + 4 + 4)
    date_value_step = 12  # 8-byte double date + 4-byte float depth

    for station, df in group:
        end_position = current_position + date_value_step * len(df)
        f.write(string_to_null_terminated_bytes(station, 1025))
        f.write(st.pack('i', 900))  # recording interval in seconds (15-minute data)
        f.write(st.pack('i', current_position))
        f.write(st.pack('i', end_position))
        # Records are written contiguously, so the next station's data
        # starts exactly where this one's ends (no +1).
        current_position = end_position

    for station, df in group:
        for index, row in df.iterrows():
            f.write(st.pack('d', datetime2ole(index)))
            f.write(st.pack('f', row['Rainfall']))
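Not part of the original exchange, but a read-back check is a handy way to confirm the layout. A sketch that assumes the file was written by the snippet above (native byte order, same machine):

import struct

with open('rain.rff', 'rb') as f:
    assert f.read(10) == b'SWMM5-RAIN'
    (num_stations,) = struct.unpack('i', f.read(4))
    # first gage's metadata record
    station_id = f.read(1025).split(b'\x00', 1)[0].decode('ascii')
    interval, start_pos, end_pos = struct.unpack('iii', f.read(12))
    print(num_stations, station_id, interval, start_pos, end_pos)
    # first rainfall record for that gage
    f.seek(start_pos)
    (ole_date,) = struct.unpack('d', f.read(8))
    (rainfall,) = struct.unpack('f', f.read(4))
    print(ole_date, rainfall)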