From: David Gray
Subject: [MIT-Scheme-devel] reading a TCP socket
Date: Tue, 28 Apr 2015 14:55:46 +0300
I’m reading some data over a raw TCP socket: the server program sends me 0d (carriage return), but what I read is 0a (line feed). I’ve used both read-string! and read-char and get the same result. Is there some character-encoding default that I need to override, or some binary mode?

Cheers,
David
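[For context: MIT Scheme textual ports perform line-ending translation by default, which would explain an incoming CR arriving as LF. A minimal sketch of switching a socket port to a byte-transparent mode, assuming MIT Scheme's documented open-tcp-stream-socket, port/set-coding, and port/set-line-ending procedures; the host name and port number here are placeholders, and the 'binary symbols should be checked against the MIT Scheme reference manual:

    ;; Open a TCP connection; host and port are hypothetical.
    (define socket (open-tcp-stream-socket "example.com" 12345))
    ;; Disable character decoding and newline translation so bytes
    ;; pass through unchanged (CR 0x0d stays CR, not LF 0x0a).
    (port/set-coding socket 'binary)
    (port/set-line-ending socket 'binary)
    ;; read-char should now yield #\return when the server sends 0x0d.
    (read-char socket)
]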