We are using the Stream functionality in RavenDB to load, transform and migrate data between two databases, like so:
var query = originSession.Query<T>(IndexForQuery);

using (var stream = originSession.Advanced.Stream(query))
{
    while (stream.MoveNext())
    {
        var streamedDocument = stream.Current.Document;

        OpenSessionAndMigrateSingleDocument(streamedDocument);
    }
}
The problem is that one of the collections has millions of rows, and we keep receiving an IOException in the following format:
Application: MigrateToNewSchema.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.IO.IOException
Stack:
at System.Net.ConnectStream.Read(Byte[], Int32, Int32)
at System.IO.Compression.DeflateStream.Read(Byte[], Int32, Int32)
at System.IO.Compression.GZipStream.Read(Byte[], Int32, Int32)
at System.IO.StreamReader.ReadBuffer(Char[], Int32, Int32, Boolean ByRef)
at System.IO.StreamReader.Read(Char[], Int32, Int32)
at Raven.Imports.Newtonsoft.Json.JsonTextReader.ReadData(Boolean, Int32)
at Raven.Imports.Newtonsoft.Json.JsonTextReader.ReadStringIntoBuffer(Char)
at Raven.Imports.Newtonsoft.Json.JsonTextReader.ParseString(Char)
at Raven.Imports.Newtonsoft.Json.JsonTextReader.ParseValue()
at Raven.Imports.Newtonsoft.Json.JsonTextReader.ReadInternal()
at Raven.Imports.Newtonsoft.Json.JsonTextReader.Read()
at Raven.Json.Linq.RavenJObject.Load(Raven.Imports.Newtonsoft.Json.JsonReader)
at Raven.Json.Linq.RavenJObject.Load(Raven.Imports.Newtonsoft.Json.JsonReader)
at Raven.Json.Linq.RavenJToken.ReadFrom(Raven.Imports.Newtonsoft.Json.JsonReader)
at Raven.Client.Connection.ServerClient+<YieldStreamResults>d__6b.MoveNext()
at Raven.Client.Document.DocumentSession+<YieldQuery>d__c`1[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].MoveNext()
at MigrateToNewSchema.Migrator.DataMigratorBase`1[[System.__Canon, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].MigrateCollection()
at MigrateToNewSchema.Program.MigrateData(MigrateToNewSchema.Enums.CollectionToMigrate, Raven.Client.IDocumentStore, Raven.Client.IDocumentStore)
at MigrateToNewSchema.Program.Main(System.String[])
This happens quite a long way into streaming, and of course transient connection issues will occur over that sort of period (it takes several hours to complete).
However, when we retry, because we are using a Query, we have to start from scratch. So if there is a connection failure at any point during the whole Stream, we have to try it again, and again, until it works end to end.
I know you can use an ETag with a stream to effectively restart at a given point, but there is no overload that does this with a Query, which we need in order to filter the results being migrated and to specify the correct collection.
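For reference, the 3.x client does expose an ETag-based overload, roughly as sketched below (the lastEtag checkpointing is our own illustrative bookkeeping, not part of the API); it walks raw documents in ETag order with no Query filtering, which is exactly the limitation described above:

Etag lastEtag = Etag.Empty; // assumption: persist this checkpoint somewhere durable between retries

using (var stream = originSession.Advanced.Stream<RavenJObject>(lastEtag))
{
    while (stream.MoveNext())
    {
        // This overload returns every document after lastEtag, regardless
        // of collection; no Query/index filtering is possible here.
        lastEtag = stream.Current.Etag;
        OpenSessionAndMigrateSingleDocument(stream.Current.Document);
    }
}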
So, within RavenDB, is there a way to improve the internal resilience of the connection (a connection string property, internal settings, etc.) or to effectively "recover" a stream on error?
Answer 0 (score: 2)
Per the suggestion of @StriplingWarrior, I recreated the solution using Data Subscriptions.
Using this approach I was able to iterate over all 2 million rows (though admittedly with much less processing per item). Two points here would have helped when we were trying to implement the same logic with Streams:

- Batches are only acknowledged once they have been processed; the subscribed IObserver<T> has to complete successfully for this acknowledgment to be set.
- The acknowledgment state is tracked by the server, so after a failure the subscription resumes from the last acknowledged batch rather than starting over.

The test environment was a RavenDB 3.0 database (local machine, running as a Windows service) with default settings, against a collection of 2 million records.
The code to generate the dummy records:
using (IDocumentStore store = GetDocumentStore())
{
    store.Initialize();

    using (var bulkInsert = store.BulkInsert())
    {
        for (var i = 0; i != recordsToCreate; i++)
        {
            var person = new Person
            {
                Id = Guid.NewGuid(),
                Firstname = NameGenerator.GenerateFirstName(),
                Lastname = NameGenerator.GenerateLastName()
            };

            bulkInsert.Store(person);
        }
    }
}
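GetDocumentStore, Person, and NameGenerator are not defined in the answer; a minimal sketch of what they might look like, purely as assumptions to make the sample self-contained:

using System;
using Raven.Client;
using Raven.Client.Document;

public class Person
{
    public Guid Id { get; set; }
    public string Firstname { get; set; }
    public string Lastname { get; set; }
}

// Hypothetical helper; any source of dummy names will do.
public static class NameGenerator
{
    private static readonly Random Random = new Random();
    private static readonly string[] FirstNames = { "Alice", "Bob", "Carol" };
    private static readonly string[] LastNames = { "Smith", "Jones", "Brown" };

    public static string GenerateFirstName() { return FirstNames[Random.Next(FirstNames.Length)]; }
    public static string GenerateLastName() { return LastNames[Random.Next(LastNames.Length)]; }
}

public static class StoreFactory
{
    // Connection details are assumptions; point this at your own server/database.
    public static IDocumentStore GetDocumentStore()
    {
        return new DocumentStore { Url = "http://localhost:8080", DefaultDatabase = "People" };
    }
}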
An example of then subscribing to this collection:
using (IDocumentStore store = GetDocumentStore())
{
    store.Initialize();

    var subscriptionId = store.Subscriptions.Create(new SubscriptionCriteria<Person>());

    var personSubscription = store.Subscriptions.Open<Person>(
        subscriptionId, new SubscriptionConnectionOptions()
        {
            BatchOptions = new SubscriptionBatchOptions()
            {
                // Max number of docs that can be sent in a single batch
                MaxDocCount = 16 * 1024,
                // Max total batch size in bytes
                MaxSize = 4 * 1024 * 1024,
                // Max time the subscription needs to confirm that the batch
                // has been successfully processed
                AcknowledgmentTimeout = TimeSpan.FromMinutes(3)
            },
            IgnoreSubscribersErrors = false,
            ClientAliveNotificationInterval = TimeSpan.FromSeconds(30)
        });

    personSubscription.Subscribe(new PersonObserver());

    while (true)
    {
        Thread.Sleep(TimeSpan.FromMilliseconds(500));
    }
}
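The while (true)/Thread.Sleep loop simply keeps the process alive while batches arrive. A slightly cleaner shutdown is possible, sketched here under the assumption that Subscribe returns the IDisposable required by IObservable<T>:

var done = new ManualResetEventSlim(false);
Console.CancelKeyPress += (sender, args) =>
{
    args.Cancel = true; // keep the process alive long enough to clean up
    done.Set();
};

using (personSubscription.Subscribe(new PersonObserver()))
{
    done.Wait(); // block until Ctrl+C instead of polling in a loop
}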
Note the PersonObserver; this is just a basic implementation of IObserver<T>, as follows:
public class PersonObserver : IObserver<Person>
{
    public void OnCompleted()
    {
        Console.WriteLine("Completed");
    }

    public void OnError(Exception error)
    {
        Console.WriteLine("Error occurred: " + error.ToString());
    }

    public void OnNext(Person person)
    {
        Console.WriteLine($"Received '{person.Firstname} {person.Lastname}'");
    }
}
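To tie this back to the original problem, the per-document work from the question's stream loop can live in OnNext instead; a hypothetical MigratingPersonObserver (OpenSessionAndMigrateSingleDocument is the question's own method, everything else is illustrative):

public class MigratingPersonObserver : IObserver<Person>
{
    public void OnCompleted()
    {
        Console.WriteLine("Completed");
    }

    public void OnError(Exception error)
    {
        // With IgnoreSubscribersErrors = false, a failed batch is never
        // acknowledged, so it is redelivered when the subscription reconnects.
        Console.WriteLine("Error occurred: " + error);
    }

    public void OnNext(Person person)
    {
        // Same per-document migration as the original stream loop.
        OpenSessionAndMigrateSingleDocument(person);
    }
}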