Red Oasis
Subscriptions: 2
Total pages: 59 | First page | Last known page | RSS
Homepage: http://www.red-oasis.com/
Added on: 2009-08-17 21:12:52
Comic status (since 2019-08-18): Hiatus
Categories: genre:sci-fi
| # | Page |
|---|---|
Crawl errors
The last 5 crawl errors from the last 30 days. An empty list doesn't necessarily mean the crawler is working correctly. I'll go through these eventually, but I don't mind if you ask me to check whether the crawler is doing the right thing.
| Page order | Time | URL | HTTP status |
|---|---|---|---|
| 58 | 2026-04-29 11:02:28 | http://red-oasis.com/comic/noctis_020/ | HttpExceptionRequest Request { host = "red-oasis.com" port = 80 secure = False requestHeaders = [("User-Agent","piperka.net/1.0")] path |
| 58 | 2026-04-28 15:02:35 | http://red-oasis.com/comic/noctis_020/ | HttpExceptionRequest Request { host = "red-oasis.com" port = 80 secure = False requestHeaders = [("User-Agent","piperka.net/1.0")] path |
| 58 | 2026-04-27 19:02:49 | http://red-oasis.com/comic/noctis_020/ | HttpExceptionRequest Request { host = "red-oasis.com" port = 80 secure = False requestHeaders = [("User-Agent","piperka.net/1.0")] path |
| 58 | 2026-04-26 22:02:47 | http://red-oasis.com/comic/noctis_020/ | HttpExceptionRequest Request { host = "red-oasis.com" port = 80 secure = False requestHeaders = [("User-Agent","piperka.net/1.0")] path |
| 58 | 2026-04-26 02:03:04 | http://red-oasis.com/comic/noctis_020/ | HttpExceptionRequest Request { host = "red-oasis.com" port = 80 secure = False requestHeaders = [("User-Agent","piperka.net/1.0")] path |