Jon says:
I don't know of anything within BinaryReader which will read an array of integers, I'm afraid. If you read into a byte array, you could then use Buffer.BlockCopy to copy those bytes into an int[] - probably the fastest form of conversion, although it relies on the endianness of your processor being appropriate for your data. Have you tried just looping round, calling BinaryReader.ReadInt32() as many times as you need to, and letting the file system do the buffering? You could always add a BufferedStream with a large buffer into the mix if you thought that would help.

Marc is of the opinion:
int[] original = { 1, 2, 3, 4 }, copy;
byte[] bytes;
using (var ms = new MemoryStream())
{
    using (var writer = new BinaryWriter(ms))
    {
        // Write the length first, then each element.
        writer.Write(original.Length);
        for (int i = 0; i < original.Length; i++)
            writer.Write(original[i]);
    }
    bytes = ms.ToArray();
}
using (var ms = new MemoryStream(bytes))
using (var reader = new BinaryReader(ms))
{
    // Read the length prefix, then loop reading one Int32 at a time.
    int len = reader.ReadInt32();
    copy = new int[len];
    for (int i = 0; i < len; i++)
    {
        copy[i] = reader.ReadInt32();
    }
}
Although personally I'd just read from the stream without BinaryReader. Actually, strictly speaking, if it were me I would use my own serializer, and just:

[ProtoContract]
public class Foo {
    [ProtoMember(1, Options = MemberSerializationOptions.Packed)]
    public int[] Bar { get; set; }
}
since this will have known endianness, handle buffering, and will use variable-length encoding to help reduce bloat if most of the numbers aren't enormous.
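Jon's Buffer.BlockCopy suggestion above can be sketched as follows. This is a minimal sketch, not code from either answer: the IntArrayReader class and ReadInt32Array method are hypothetical names, and the block copy relies on the bytes in the stream matching the processor's endianness.

```csharp
using System;
using System.IO;

static class IntArrayReader
{
    // Reads `count` Int32 values by pulling the raw bytes off the stream
    // and block-copying them into an int[] in one go. Assumes the data's
    // byte order matches the machine's native endianness.
    public static int[] ReadInt32Array(Stream stream, int count)
    {
        byte[] buffer = new byte[count * sizeof(int)];

        // Stream.Read may return fewer bytes than requested, so loop
        // until the buffer is full (or the stream ends prematurely).
        int read = 0;
        while (read < buffer.Length)
        {
            int chunk = stream.Read(buffer, read, buffer.Length - read);
            if (chunk == 0)
                throw new EndOfStreamException();
            read += chunk;
        }

        int[] result = new int[count];
        Buffer.BlockCopy(buffer, 0, result, 0, buffer.Length);
        return result;
    }
}
```

When reading from a file, wrapping the FileStream in a BufferedStream with a large buffer, as Jon mentions, would cut down on small reads.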