purpose_id 29 not updated since mid-March #33

Closed
eileenpeppard opened this issue May 22, 2019 · 6 comments

Comments

@eileenpeppard
Member

Hi, Guy.

Just one sensor has not been updated: ID 29.
Query string: ABSPATH:1:#netzero_weather_conditions/barometer_tn
Last timestamp in the database: 2019-03-16 10:20:00
I've checked WebCTRL and the sensor is still there.
Not top priority but should get fixed. Thanks a million.

Eileen

@carlosparadis
Member

@eileenpeppard can you confirm whether this was the date the server went belly up due to lack of storage space? Just noting it for reference as a potential cause, since that did happen.

@eileenpeppard
Member Author

Yes, it was the same time. For some reason it didn't catch up with the rest of the sensors.

@carlosparadis
Member

The timestamp on the state file was 1552162500.0 (03/09/2019). I changed it to 03/15/2019 @ 12:00am (1552608000). That didn't fix it. As it stands, I have no idea what is going wrong. At this point, I think I will just move the code to production with @matthew-schultz at the end of next week and be done with spending time on this code.
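For reference, the state-file values above are Unix epoch seconds; a quick sketch with the standard library converts them to readable UTC dates when hand-editing the state file (the function name here is illustrative, not part of scrape-util):

```python
from datetime import datetime, timezone

def epoch_to_utc(ts):
    """Render a state-file epoch timestamp as a UTC date string."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime('%Y-%m-%d %H:%M:%S')

print(epoch_to_utc(1552162500.0))  # the old state timestamp (03/09/2019 UTC)
print(epoch_to_utc(1552608000))    # the replacement value (03/15/2019 midnight UTC)
```

Note the dates in the comments above are UTC; Hawaii time (HST, UTC-10) is ten hours earlier.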

Also, when running manually, sometimes the code works and sometimes it doesn't:

running project: uhm-frog

querying egauge: frog-1-egauge...
querying egauge: frog-2-egauge...
egauge queries complete...

querying: ABSPATH:1:#frog2_room_sensors/zone_temp_1_tn
Traceback (most recent call last):
  File "/usr/local/bin/scrape-util", line 25, in <module>
    main()
  File "/usr/local/bin/scrape-util", line 22, in main
    run(proj=args['projects'],wrap=args['wrap'])
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 24, in run
    runit(project)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 20, in <lambda>
    runit = lambda p: run_wrapped(p) if wrap else run_project(p)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 54, in run_project
    state,data = acquire_data(project,config['acquire'],state)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 88, in acquire_data
    substate,rows = scraper.acquire(project,config[method],substate)
  File "/usr/local/lib/scrape-util/src/acquire/webctrl.py", line 51, in acquire
    nonce[uid] = max(stamps) # set the new nonce value.
ValueError: max() arg is an empty sequence

This error appeared the first time I ran it manually; running again immediately afterward didn't throw it.
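The `ValueError` above comes from calling `max()` on an empty list of timestamps when a sensor's query returns no new rows. A defensive guard would avoid the crash; this is a sketch with illustrative names, not the actual code surrounding line 51 of webctrl.py:

```python
def update_nonce(nonce, uid, stamps):
    """Advance the per-sensor nonce only when new readings exist.

    `stamps` holds the reading timestamps returned for sensor `uid`.
    When the query returns nothing, max() on the empty list raises
    ValueError (the crash above); keeping the old nonce instead lets
    the next run retry from the same point.
    """
    if stamps:
        nonce[uid] = max(stamps)  # set the new nonce value
    return nonce
```

This would also explain the intermittent behavior: a run that happens to get data back succeeds, while a run that gets an empty result crashes.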

@carlosparadis
Member

carlosparadis commented May 22, 2019

Made another attempt, now moving the timestamp to 1552731660 (2019-03-16 10:21:00), one minute ahead of the last reading timestamp reported by Eileen. After the last run, the state timestamp was 1552162500.0, which is 03/09/2019.

Scrape-util threw another error:

querying egauge: e34110...

------------- warning -------------
  query failed with status code:  404
  check gauge if this problem persists
-----------------------------------

querying egauge: e786...

------------- warning -------------
  query failed with status code:  404
  check gauge if this problem persists
-----------------------------------

querying egauge: e34107...

------------- warning -------------
  query failed with status code:  404
  check gauge if this problem persists
-----------------------------------

querying egauge: e790...
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 137, in _new_conn
    (self.host, self.port), self.timeout, **extra_kw)
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 91, in create_connection
    raise err
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 81, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 560, in urlopen
    body=body, headers=headers)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 354, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.5/http/client.py", line 1106, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.5/http/client.py", line 1151, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.5/http/client.py", line 1102, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.5/http/client.py", line 934, in _send_output
    self.send(msg)
  File "/usr/lib/python3.5/http/client.py", line 877, in send
    self.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 162, in connect
    conn = self._new_conn()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 146, in _new_conn
    self, "Failed to establish a new connection: %s" % e)
requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x7f2bfd3fdba8>: Failed to establish a new connection: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 376, in send
    timeout=timeout
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 610, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 273, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='egauge790.egaug.es', port=80): Max retries exceeded with url: /cgi-bin/egauge-show?c&C&m&t=1530918000&f=1558505598 (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f2bfd3fdba8>: Failed to establish a new connection: [Errno 110] Connection timed out',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/scrape-util", line 25, in <module>
    main()
  File "/usr/local/bin/scrape-util", line 22, in main
    run(proj=args['projects'],wrap=args['wrap'])
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 24, in run
    runit(project)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 20, in <lambda>
    runit = lambda p: run_wrapped(p) if wrap else run_project(p)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 54, in run_project
    state,data = acquire_data(project,config['acquire'],state)
  File "/usr/local/lib/scrape-util/src/core/runtime.py", line 88, in acquire_data
    substate,rows = scraper.acquire(project,config[method],substate)
  File "/usr/local/lib/scrape-util/src/acquire/egauge.py", line 21, in acquire
    raw = query(gauges[gid],starts[gid],stops[gid])
  File "/usr/local/lib/scrape-util/src/acquire/egauge.py", line 75, in query
    r = requests.get(uri.format(gauge),params=params)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 67, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 53, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 480, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 588, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 437, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='egauge790.egaug.es', port=80): Max retries exceeded with url: /cgi-bin/egauge-show?c&C&m&t=1530918000&f=1558505598 (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7f2bfd3fdba8>: Failed to establish a new connection: [Errno 110] Connection timed out',))

Let's see how many more I can get tonight. Data still does not upload.
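The run above dies entirely when a single gauge is unreachable, even though plain 404s are already downgraded to warnings. A minimal sketch of a more forgiving query wrapper, assuming the same `egauge-show` URL seen in the traceback (the helper name and warning format are hypothetical, not scrape-util's actual API):

```python
import requests

def query_gauge(host, start, stop, timeout=30):
    """Query one eGauge's /cgi-bin/egauge-show endpoint.

    Returns the response body, or None on any network failure or bad
    HTTP status, so one dead or unreachable gauge cannot abort the
    whole scrape run the way the ConnectionError above did.
    """
    uri = 'http://{}/cgi-bin/egauge-show?c&C&m&t={}&f={}'.format(host, start, stop)
    try:
        r = requests.get(uri, timeout=timeout)
        r.raise_for_status()  # treat 404s like the warnings above
        return r.text
    except requests.exceptions.RequestException as exc:
        print('warning: query to {} failed: {}'.format(host, exc))
        return None
```

An explicit `timeout` also matters here: the default in requests is to wait indefinitely, which is why the hung connection to egauge790.egaug.es took so long to fail.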

@carlosparadis
Member

carlosparadis commented May 22, 2019

Now attempting the reading timestamp suggested by Ryan in #29: 1554717600.0 (Monday, April 8, 2019 12:00:00 AM GMT-10:00), from that comment.

That worked, and the data was finally populated in the database. However, this leaves a data gap: if, per @eileenpeppard, the latest timestamp was 2019-03-16 10:20:00, no data will exist between then and 04/08/2019 12 AM HST. According to @ryantanaka, that timestamp is when WebCTRL came back online.

@eileenpeppard please verify how much data is missing because of this. I don't have access to WebCTRL from home.

@carlosparadis
Member

Eileen confirmed it was fine. Closing this.
