
Fix for LineFileDocs Bottleneck/Performance Improvements #325

Merged: 53 commits merged into apache:master on Aug 22, 2020

Conversation

@NightOwl888 (Contributor) commented Aug 11, 2020

This fixes a bottleneck (see #261) caused by unzipping the line docs file in RAM (~15MB) and then selecting a random line in the file. The .NET GZipStream is not seekable, so this was done by copying the entire contents into a MemoryStream first. This occurred in a significant portion of the tests (~20%), and the full copy was repeated in each one of those tests.
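Below is a minimal sketch, assuming only standard .NET APIs, of the general approach described above: decompress the GZip file to a seekable temp file once instead of buffering ~15MB into a MemoryStream on every use. The helper name is hypothetical and this is not the PR's actual code.

```csharp
using System.IO;
using System.IO.Compression;

public static class LineDocsExtraction
{
    // Hypothetical helper: stream the GZip contents to a temp file once;
    // the resulting file is seekable, unlike GZipStream.
    public static string UnzipToTempFile(string gzipPath)
    {
        string tempFile = Path.GetTempFileName();
        using (var input = new GZipStream(File.OpenRead(gzipPath), CompressionMode.Decompress))
        using (var output = File.Create(tempFile))
        {
            input.CopyTo(output);
        }
        return tempFile;
    }
}
```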

The fix was to set up the test framework to unzip the file to a temp file on the test machine. This happens in one of 3 different ways (a usage sketch follows the list):

  1. If LineFileDocs is used directly in a class that does not specify LuceneTestCase.UseTempLineDocsFile = true, LineFileDocs will unzip the file before it is used (per instance of the class) and delete it when it is disposed.
  2. If LuceneTestCase.UseTempLineDocsFile = true is specified in the test fixture, the file will be unzipped in the BeforeClass() method and deleted in the AfterClass() method.
  3. If the test project makes heavy use of this file, adding a subclass of LuceneTestFrameworkInitializer to the test project (outside of all namespaces) will cause the file to be unzipped only once for all of the tests in that project and deleted after the last test is finished.
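A minimal sketch of options 2 and 3, assuming the Lucene.Net.TestFramework types named above (LuceneTestCase, LuceneTestFrameworkInitializer); the class names, the using directive, and the exact place where UseTempLineDocsFile is assigned are illustrative assumptions rather than the framework's documented API.

```csharp
using Lucene.Net.Util; // assumed namespace for the test framework types

// Option 3: a subclass of LuceneTestFrameworkInitializer declared outside of all
// namespaces causes the line docs file to be unzipped once per test assembly and
// deleted after the last test finishes.
public class TestAssemblySetup : LuceneTestFrameworkInitializer
{
}

// Option 2: a fixture that opts in via LuceneTestCase.UseTempLineDocsFile = true
// gets the file unzipped in BeforeClass() and deleted in AfterClass().
public class MyLineDocsTests : LuceneTestCase
{
    static MyLineDocsTests()
    {
        UseTempLineDocsFile = true; // assumed to be a static flag on LuceneTestCase
    }
}
```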

There are also several other patches in this PR:

  • The seek behavior of LineFileDocs was reverted to Lucene's original implementation, which has revealed some (potential) false positives in some of the ICU tests. A BufferedStream was added to improve performance (see the sketch after this list).
  • Removed unnecessary variable allocations.
  • Fixed a bug with the Nightly, Weekly, Slow, and AwaitsFix attributes so they will wait until NUnit runs the initialization code before running.
  • Added a DeadlockAttribute to time out tests that are now showing threading contention issues after the raw speed improvements. This ensures they will fail in the CI environment if they actually deadlock, and the attribute can also be used to filter these tests out of a run.
  • Simplified some expressions to make them easier to maintain.
  • Commented out dead code and unnecessary variable declarations that were carried over from Java.
  • Fixed a bug in the ICUTokenizer where it was calling System.Char.IsWhiteSpace() when it should have been calling ICU4N.UChar.IsWhiteSpace(), to ensure it uses the whitespace definition from the correct version of ICU.
  • Changed implementation of DisposableThreadLocal to that of RavenDB, with permission from its maintainers. (closes The design and implementation of a better ThreadLocal<T> #251)
  • Fixed Facet DrillSideways search NullReferenceException in DrillSideways.Search - ReqExclScorer.GetCost #274
  • Fixed locking and initialization behavior throughout Lucene.Net.Facet
  • Fixed a boxing issue with Facet cache
  • Changed the priority queue type to be a struct instead of a class
  • Many other updates to Lucene.Net.Facet to make it more idiomatic .NET
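As a rough illustration of the reverted seek behavior and the BufferedStream mentioned above (and of the "seek to line break by reading the stream in chunks" commit below), the sketch picks a random position in the extracted temp file and scans forward to the next line break before reading. Names are illustrative; this is not the PR's code.

```csharp
using System;
using System.IO;
using System.Text;

public static class LineDocsSeek
{
    public static StreamReader OpenAtRandomLine(string path, Random random)
    {
        // BufferedStream means the byte-at-a-time scan below reads from an
        // in-memory buffer rather than issuing tiny reads against the file.
        var stream = new BufferedStream(File.OpenRead(path));
        long seekTo = (long)(random.NextDouble() * stream.Length);
        stream.Seek(seekTo, SeekOrigin.Begin);

        // Skip forward to the start of the next full line so we never begin
        // reading in the middle of a document.
        int b;
        while ((b = stream.ReadByte()) != -1 && b != '\n')
        {
        }

        return new StreamReader(stream, Encoding.UTF8);
    }
}
```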

…t is GZip, unzip to a temp file instead of using a MemoryStream to save RAM
…e: Unzip LineFileDocs at the class level instead of in each test
…ek to line break by reading the stream in chunks instead of 8 bytes at a time.
…is zipped at the class level, so we don't have so much overhead with selecting a random line in the file.
…esWriter::AddBinaryField(): Removed unnecessary array allocation
…w subclasses to specify whether to extract the LineDocsFile to a temp file at the class level. Changed LuceneTestFrameworkInitializer to automatically extract LineDocsFile to a temp file if it is overridden and the specified file is zipped. Also fixed all attributes in LuceneTestCase to correctly allow LuceneTestFrameworkInitializer to be called by NUnit before attempting to call LuceneTestFrameworkInitializer.EnsureInitialized().
… NUnit to unzip the LineDocsFile only 1 time per test run instead of on each class or each test.
… same way that Lucene does, except using a BufferedStream to improve performance.
…varying widely in completion time due to threading contention or deadlock
…ace() rather than System.Char.IsWhiteSpace(), which may return different results.
…nd added some optimizations to Core/TestFactories
…asses to using the Analyzer.NewAnonymous() method inline.
@Shazwazza (Contributor) commented:

I've run all tests locally twice with these changes. The first time was fine but the second time yielded one error:

TestSnapshotDeletionPolicy.TestMultiThreadedSnapshotting

Error

 TestMultiThreadedSnapshotting
   Source: TestSnapshotDeletionPolicy.cs line 352
   Duration: 16 ms

  Message: 
    System.Exception : Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
    ', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
    Data:
      OriginalMessage: System.Exception: Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
    ', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
     ---> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException: Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
    ', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
       at Lucene.Net.Index.DocumentsWriterFlushControl.AddFlushableState(ThreadState perThread) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriterFlushControl.cs:line 711
       at Lucene.Net.Index.DocumentsWriterFlushControl.MarkForFullFlush() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriterFlushControl.cs:line 644
       at Lucene.Net.Index.DocumentsWriter.FlushAllThreads(IndexWriter indexWriter) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriter.cs:line 738
       at Lucene.Net.Index.IndexWriter.PrepareCommitInternal() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3557
       at Lucene.Net.Index.IndexWriter.CommitInternal() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3750
       at Lucene.Net.Index.IndexWriter.Commit() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3709
       at Lucene.Net.Index.TestSnapshotDeletionPolicy.ThreadAnonymousInnerClassHelper2.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 414
       at J2N.Threading.ThreadJob.SafeRun(ThreadStart start)
       at J2N.Threading.ThreadJob.<.ctor>b__4_0()
       at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
       at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
       at System.Threading.ThreadHelper.ThreadStart()
    
       --- End of inner exception stack trace ---
       at Lucene.Net.Index.TestSnapshotDeletionPolicy.ThreadAnonymousInnerClassHelper2.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 419
       at J2N.Threading.ThreadJob.SafeRun(ThreadStart start)
    
      ----> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException : Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
    ', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
  Stack Trace: 
    ThreadJob.Join()
    TestSnapshotDeletionPolicy.TestMultiThreadedSnapshotting() line 374
    --DebugAssertException
    DocumentsWriterFlushControl.AddFlushableState(ThreadState perThread) line 711
    DocumentsWriterFlushControl.MarkForFullFlush() line 644
    DocumentsWriter.FlushAllThreads(IndexWriter indexWriter) line 738
    IndexWriter.PrepareCommitInternal() line 3557
    IndexWriter.CommitInternal() line 3750
    IndexWriter.Commit() line 3709
    ThreadAnonymousInnerClassHelper2.Run() line 414
    ThreadJob.SafeRun(ThreadStart start)
    <.ctor>b__4_0()
    ThreadHelper.ThreadStart_Context(Object state)
    ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
    ThreadHelper.ThreadStart()

Full output

Test Name:	TestMultiThreadedSnapshotting
Test Outcome:	Failed
Result Message:	
System.Exception : Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
Data:
  OriginalMessage: System.Exception: Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
 ---> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException: Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
   at Lucene.Net.Index.DocumentsWriterFlushControl.AddFlushableState(ThreadState perThread) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriterFlushControl.cs:line 711
   at Lucene.Net.Index.DocumentsWriterFlushControl.MarkForFullFlush() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriterFlushControl.cs:line 644
   at Lucene.Net.Index.DocumentsWriter.FlushAllThreads(IndexWriter indexWriter) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\DocumentsWriter.cs:line 738
   at Lucene.Net.Index.IndexWriter.PrepareCommitInternal() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3557
   at Lucene.Net.Index.IndexWriter.CommitInternal() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3750
   at Lucene.Net.Index.IndexWriter.Commit() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net\Index\IndexWriter.cs:line 3709
   at Lucene.Net.Index.TestSnapshotDeletionPolicy.ThreadAnonymousInnerClassHelper2.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 414
   at J2N.Threading.ThreadJob.SafeRun(ThreadStart start)
   at J2N.Threading.ThreadJob.<.ctor>b__4_0()
   at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.ThreadHelper.ThreadStart()

   --- End of inner exception stack trace ---
   at Lucene.Net.Index.TestSnapshotDeletionPolicy.ThreadAnonymousInnerClassHelper2.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 419
   at J2N.Threading.ThreadJob.SafeRun(ThreadStart start)

  ----> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException : Method Debug.Assert failed with 'DWPT must never be null here since we hold the lock and it holds documents
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
Result StandardOutput:	
Culture: ru-KZ
Time Zone: (UTC+04:00) Astrakhan, Ulyanovsk
Default Codec: CheapBastard (CheapBastardCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=crazy): {}
Nightly: False
Weekly: False
Slow: True
Awaits Fix: False
Directory: random
Verbose: True
Random Multiplier: 1
IFD 281 [11.08.2020 16:13:17; NonParallelWorker]: init: current segments file is ""; deletionPolicy=Lucene.Net.Index.SnapshotDeletionPolicy
IFD 281 [11.08.2020 16:13:17; NonParallelWorker]: now checkpoint "" [0 segments ; isCommit = False]
IFD 281 [11.08.2020 16:13:17; NonParallelWorker]: 0 msec to checkpoint
IW 281 [11.08.2020 16:13:17; NonParallelWorker]: init: create=True
IW 281 [11.08.2020 16:13:17; NonParallelWorker]: 
dir=MockDirectoryWrapper(RAMDirectory@47038e lockFactory=Lucene.Net.Store.SingleInstanceLockFactory)
index=
version=4.8.0
matchVersion=LUCENE_48
analyzer=MockAnalyzer
ramBufferSizeMB=16
maxBufferedDocs=-1
maxBufferedDeleteTerms=-1
mergedSegmentWarmer=
readerTermsIndexDivisor=3
termIndexInterval=32
delPolicy=SnapshotDeletionPolicy
commit=null
openMode=CREATE_OR_APPEND
similarity=RandomSimilarityProvider
mergeScheduler=ConcurrentMergeScheduler: maxThreadCount=1, maxMergeCount=2, mergeThreadPriority=-1
default WRITE_LOCK_TIMEOUT=1000
writeLockTimeout=1000
codec=CheapBastard
infoStream=ThreadNameFixingPrintStreamInfoStream
mergePolicy=[TieredMergePolicy: maxMergeAtOnce=17, maxMergeAtOnceExplicit=11, maxMergedSegmentMB=27,6845703125, floorSegmentMB=0,9677734375, forceMergeDeletesPctAllowed=28,449725810647813, segmentsPerTier=15, maxCFSSegmentSizeMB=8796093022207,999, noCFSRatio=1
indexerThreadPool=Lucene.Net.Index.ThreadAffinityDocumentsWriterThreadPool
readerPooling=True
perThreadHardLimitMB=1945
useCompoundFile=True
checkIntegrityAtMerge=False

IW 281 [11.08.2020 16:13:17; t0]: commit: start
IW 281 [11.08.2020 16:13:17; t0]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t0]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t0]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t0]:   index before flush 
DW 281 [11.08.2020 16:13:17; t0]: startFullFlush
DW 281 [11.08.2020 16:13:17; t0]: anyChanges? numDocsInRam=1 deletes=False hasTickets:False pendingChangesInFullFlush: False
DWFC 281 [11.08.2020 16:13:17; t0]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_0, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWPT 281 [11.08.2020 16:13:17; t0]: flush postings as segment _0 numDocs=1
DWPT 281 [11.08.2020 16:13:17; t0]: new segment has 0 deleted docs
DWPT 281 [11.08.2020 16:13:17; t0]: new segment has no vectors; no norms; no docValues; no prox; no freqs
DWPT 281 [11.08.2020 16:13:17; t0]: flushedFiles=[_0.fdt, _0.fdx, _0.doc, _0.tim, _0.tip, _0.fnm]
DWPT 281 [11.08.2020 16:13:17; t0]: flushed codec=CheapBastard
DWPT 281 [11.08.2020 16:13:17; t0]: flushed: segment=_0 ramUsed=0 MB newFlushedSize(includes docstores)=0.00033473968505859375 MB docs/MB=2987.3960113960115
IW 281 [11.08.2020 16:13:17; t0]: create compound file _0.cfs
DW 281 [11.08.2020 16:13:17; t0]: publishFlushedSegment seg-private updates=
IW 281 [11.08.2020 16:13:17; t0]: publishFlushedSegment
IW 281 [11.08.2020 16:13:17; t0]: publish sets newSegment delGen=1 seg=_0(4.8):c1
IFD 281 [11.08.2020 16:13:17; t0]: now checkpoint "_0(4.8):c1" [1 segments ; isCommit = False]
IFD 281 [11.08.2020 16:13:17; t0]: 0 msec to checkpoint
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.fdt"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.fdt"
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.fdx"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.fdx"
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.doc"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.doc"
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.tim"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.tim"
IW 281 [11.08.2020 16:13:17; t1]: commit: start
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.tip"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.tip"
IFD 281 [11.08.2020 16:13:17; t0]: delete new file "_0.fnm"
IFD 281 [11.08.2020 16:13:17; t0]: delete "_0.fnm"
IW 281 [11.08.2020 16:13:17; t0]: apply all deletes during flush
BD 281 [11.08.2020 16:13:17; t0]: applyDeletes: no deletes; skipping
BD 281 [11.08.2020 16:13:17; t0]: prune sis=Lucene.Net.Index.SegmentInfos minGen=1 packetCount=0
DW 281 [11.08.2020 16:13:17; t0]: t0 finishFullFlush success=True
TMP 281 [11.08.2020 16:13:17; t0]: findMerges: 1 segments
TMP 281 [11.08.2020 16:13:17; t0]:   seg=_0(4.8):c1 size=0,000 MB [floored]
TMP 281 [11.08.2020 16:13:17; t0]:   allowedSegmentCount=1 vs count=1 (eligible count=1) tooBigCount=0
CMS 281 [11.08.2020 16:13:17; t0]: now merge
CMS 281 [11.08.2020 16:13:17; t0]:   index: _0(4.8):c1
CMS 281 [11.08.2020 16:13:17; t0]:   no more merges pending; now return
IW 281 [11.08.2020 16:13:17; t0]: StartCommit(): start
IW 281 [11.08.2020 16:13:17; t0]: startCommit index=_0(4.8):c1 changeCount=4
IW 281 [11.08.2020 16:13:17; t0]: done all syncs: [_0.cfs, _0.cfe, _0.si]
IW 281 [11.08.2020 16:13:17; t0]: commit: pendingCommit != null
IW 281 [11.08.2020 16:13:17; t0]: commit: wrote segments file "segments_1"
IFD 281 [11.08.2020 16:13:17; t0]: now checkpoint "_0(4.8):c1" [1 segments ; isCommit = True]
IW 281 [11.08.2020 16:13:17; t2]: commit: start
IFD 281 [11.08.2020 16:13:17; t0]: 0 msec to checkpoint
IW 281 [11.08.2020 16:13:17; t0]: commit: done
IW 281 [11.08.2020 16:13:17; t1]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t1]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t1]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t1]:   index before flush _0(4.8):c1
DW 281 [11.08.2020 16:13:17; t1]: startFullFlush
DW 281 [11.08.2020 16:13:17; t1]: anyChanges? numDocsInRam=2 deletes=False hasTickets:False pendingChangesInFullFlush: False
DWFC 281 [11.08.2020 16:13:17; t1]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_1, aborting=False, numDocsInRAM=2, deleteQueue=DWDQ: [ generation: 1 ]]
DWPT 281 [11.08.2020 16:13:17; t1]: flush postings as segment _1 numDocs=2
IW 281 [11.08.2020 16:13:17; t3]: commit: start
DWPT 281 [11.08.2020 16:13:17; t1]: new segment has 0 deleted docs
DWPT 281 [11.08.2020 16:13:17; t1]: new segment has no vectors; no norms; no docValues; no prox; no freqs
DWPT 281 [11.08.2020 16:13:17; t1]: flushedFiles=[_1.fdt, _1.fdx, _1.doc, _1.tim, _1.tip, _1.fnm]
DWPT 281 [11.08.2020 16:13:17; t1]: flushed codec=CheapBastard
DWPT 281 [11.08.2020 16:13:17; t1]: flushed: segment=_1 ramUsed=0 MB newFlushedSize(includes docstores)=0.00034332275390625 MB docs/MB=5825.422222222222
IW 281 [11.08.2020 16:13:17; t1]: create compound file _1.cfs
IW 281 [11.08.2020 16:13:17; t4]: commit: start
DW 281 [11.08.2020 16:13:17; t1]: publishFlushedSegment seg-private updates=
IW 281 [11.08.2020 16:13:17; t1]: publishFlushedSegment
IW 281 [11.08.2020 16:13:17; t1]: publish sets newSegment delGen=3 seg=_1(4.8):c2
IFD 281 [11.08.2020 16:13:17; t1]: now checkpoint "_0(4.8):c1 _1(4.8):c2" [2 segments ; isCommit = False]
IFD 281 [11.08.2020 16:13:17; t1]: 0 msec to checkpoint
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.fdt"
IW 281 [11.08.2020 16:13:17; t5]: commit: start
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.fdt"
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.fdx"
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.fdx"
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.doc"
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.doc"
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.tim"
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.tim"
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.tip"
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.tip"
IFD 281 [11.08.2020 16:13:17; t1]: delete new file "_1.fnm"
IFD 281 [11.08.2020 16:13:17; t1]: delete "_1.fnm"
IW 281 [11.08.2020 16:13:17; t1]: apply all deletes during flush
BD 281 [11.08.2020 16:13:17; t1]: applyDeletes: no deletes; skipping
BD 281 [11.08.2020 16:13:17; t1]: prune sis=Lucene.Net.Index.SegmentInfos minGen=1 packetCount=0
DW 281 [11.08.2020 16:13:17; t1]: t1 finishFullFlush success=True
TMP 281 [11.08.2020 16:13:17; t1]: findMerges: 2 segments
TMP 281 [11.08.2020 16:13:17; t1]:   seg=_1(4.8):c2 size=0,000 MB [floored]
TMP 281 [11.08.2020 16:13:17; t1]:   seg=_0(4.8):c1 size=0,000 MB [floored]
TMP 281 [11.08.2020 16:13:17; t1]:   allowedSegmentCount=1 vs count=2 (eligible count=2) tooBigCount=0
CMS 281 [11.08.2020 16:13:17; t1]: now merge
CMS 281 [11.08.2020 16:13:17; t1]:   index: _0(4.8):c1 _1(4.8):c2
CMS 281 [11.08.2020 16:13:17; t1]:   no more merges pending; now return
IW 281 [11.08.2020 16:13:17; t1]: StartCommit(): start
IW 281 [11.08.2020 16:13:17; t1]: startCommit index=_0(4.8):c1 _1(4.8):c2 changeCount=6
IW 281 [11.08.2020 16:13:17; t1]: done all syncs: [_0.cfs, _0.cfe, _0.si, _1.cfs, _1.cfe, _1.si]
IW 281 [11.08.2020 16:13:17; t1]: commit: pendingCommit != null
IW 281 [11.08.2020 16:13:17; t6]: commit: start
IW 281 [11.08.2020 16:13:17; t1]: commit: wrote segments file "segments_2"
IFD 281 [11.08.2020 16:13:17; t1]: now checkpoint "_0(4.8):c1 _1(4.8):c2" [2 segments ; isCommit = True]
IFD 281 [11.08.2020 16:13:17; t1]: 0 msec to checkpoint
IW 281 [11.08.2020 16:13:17; t1]: commit: done
IW 281 [11.08.2020 16:13:17; t2]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t2]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t2]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t2]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t2]: startFullFlush
DW 281 [11.08.2020 16:13:17; t2]: anyChanges? numDocsInRam=4 deletes=False hasTickets:False pendingChangesInFullFlush: False
DWFC 281 [11.08.2020 16:13:17; t2]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_2, aborting=False, numDocsInRAM=4, deleteQueue=DWDQ: [ generation: 2 ]]
IW 281 [11.08.2020 16:13:17; t2]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t2]: t2 finishFullFlush success=False
DWFC 281 [11.08.2020 16:13:17; t7]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_2, aborting=False, numDocsInRAM=4, deleteQueue=DWDQ: [ generation: 2 ]]
IW 281 [11.08.2020 16:13:17; t7]: commit: start
IW 281 [11.08.2020 16:13:17; t9]: commit: start
IW 281 [11.08.2020 16:13:17; t8]: commit: start
IW 281 [11.08.2020 16:13:17; t3]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t3]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t3]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t3]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t3]: startFullFlush
DW 281 [11.08.2020 16:13:17; t3]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t3]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t3]: t3 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t4]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t4]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t4]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t4]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t4]: startFullFlush
DW 281 [11.08.2020 16:13:17; t4]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t4]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t4]: t4 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t5]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t5]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t5]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t5]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t5]: startFullFlush
DW 281 [11.08.2020 16:13:17; t5]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t5]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t5]: t5 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t6]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t6]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t6]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t6]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t6]: startFullFlush
DW 281 [11.08.2020 16:13:17; t6]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t6]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t6]: t6 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t9]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t9]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t9]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t9]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t9]: startFullFlush
DW 281 [11.08.2020 16:13:17; t9]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t9]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t9]: t9 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t8]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t8]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t8]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t8]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t8]: startFullFlush
DW 281 [11.08.2020 16:13:17; t8]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t8]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t8]: t8 finishFullFlush success=False
IW 281 [11.08.2020 16:13:17; t7]: commit: enter lock
IW 281 [11.08.2020 16:13:17; t7]: commit: now prepare
IW 281 [11.08.2020 16:13:17; t7]: prepareCommit: flush
IW 281 [11.08.2020 16:13:17; t7]:   index before flush _0(4.8):c1 _1(4.8):c2
DW 281 [11.08.2020 16:13:17; t7]: startFullFlush
DW 281 [11.08.2020 16:13:17; t7]: anyChanges? numDocsInRam=7 deletes=False hasTickets:False pendingChangesInFullFlush: False
IW 281 [11.08.2020 16:13:17; t7]: hit exception during prepareCommit
DW 281 [11.08.2020 16:13:17; t7]: t7 finishFullFlush success=False

@NightOwl888 (Contributor, Author) commented:

Thanks for the info. I haven't been running the tests in debug mode, and as I mentioned here, there is a suite of tests that are only enabled in debug mode, which we have been skipping. Once we fix the ability to "turn on assert" in release mode, there will probably be some problems to address that we haven't been aware of, because all of the assert "tests" are currently being skipped.
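A hypothetical sketch of what "turning on assert" in a Release build could look like: gate assert-style checks behind a runtime flag and throw a normal exception, rather than relying on System.Diagnostics.Debug.Assert, which is compiled out of Release builds. This is only one possible approach, not the project's actual implementation.

```csharp
using System;

public static class Asserts
{
    // Could be driven by an environment variable or a test framework setting.
    public static bool Enabled { get; set; }

    public static void Assert(bool condition, string message)
    {
        // Unlike Debug.Assert, this check survives Release compilation and
        // fails the calling test with an ordinary exception.
        if (Enabled && !condition)
            throw new InvalidOperationException($"Assert failed: {message}");
    }
}
```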

Of course, it would be best to fix it that way. Currently, both the "Verbose" setting and the asserts are turned on when running in debug mode. Verbose makes it painfully slow. It would be better if we could turn each of them on and off independently.

Not that this problem isn't concerning, but I suspect that it isn't a new problem introduced by this PR, which can be confirmed by running the master branch in debug mode. We should probably also check out each tag to determine which release this first appeared in, as I am sure that earlier betas did not throw this exception. When we released 4.8.0-beta00004, the tests were run in debug mode - we switched to running them in release mode at a later point (for sure by 4.8.0-beta00007). IIRC, the primary reason for the switch was that Microsoft changed the behavior of Debug.Assert to crash the test framework when an assert fails instead of outputting any useful information and continuing. But we also set it up to test the very same binaries that were being released on Azure DevOps, which are always compiled in release mode.

That being said, if we can confirm this PR isn't introducing any new problems (other than the threading contention and ICU test failures, which are also on our radar to fix), we can merge it.

@NightOwl888 (Contributor, Author) commented Aug 11, 2020

I confirmed that the error exists in the master branch, and I got it to occur on beta00006 on the second try with a [Repeat(5000)] attribute. It does not occur in beta00005, even when running it 100,000 times.
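For reference, this is roughly how such a repeated run looks with NUnit's [Repeat] attribute, which re-runs the test body the given number of times within a single test run; the class and body below are placeholders, not the real test.

```csharp
using NUnit.Framework;

[TestFixture]
public class SnapshotDeletionPolicyRepro
{
    [Test, Repeat(5000)]
    public void TestMultiThreadedSnapshotting_Repro()
    {
        // In the real repro this is TestSnapshotDeletionPolicy.TestMultiThreadedSnapshotting.
        Assert.Pass();
    }
}
```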

I also noticed that if the repeat count is increased to 10,000, it causes an OutOfMemoryException.

I also saw this error occur from the same test:


Test Name:	TestMultiThreadedSnapshotting
Test FullName:	Lucene.Net.Tests._I-J.Lucene.Net.Index.TestSnapshotDeletionPolicy.TestMultiThreadedSnapshotting
Test Source:	F:\Projects\lucenenet\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs : line 353
Test Outcome:	Failed
Test Duration:	0:00:00

Test Name:	TestMultiThreadedSnapshotting
Test Outcome:	Failed
Result StackTrace:	
at J2N.Threading.ThreadJob.Join()
   at Lucene.Net.Index.TestSnapshotDeletionPolicy.TestMultiThreadedSnapshotting() in F:\Projects\lucenenet\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 375
--DebugAssertException
Result Message:	
System.Exception : Method <method> failed with 'full flush buffer should be empty: System.Collections.Generic.List`1[Lucene.Net.Index.DocumentsWriterPerThread]
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
Data:
  OriginalMessage: System.Exception: Method <method> failed with 'full flush buffer should be empty: System.Collections.Generic.List`1[Lucene.Net.Index.DocumentsWriterPerThread]
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
 ---> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException: Method <method> failed with 'full flush buffer should be empty: System.Collections.Generic.List`1[Lucene.Net.Index.DocumentsWriterPerThread]
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.

   --- End of inner exception stack trace ---
   at Lucene.Net.Index.TestSnapshotDeletionPolicy.ThreadAnonymousInnerClassHelper2.Run() in F:\Projects\lucenenet\src\Lucene.Net.Tests\Index\TestSnapshotDeletionPolicy.cs:line 420
   at J2N.Threading.ThreadJob.SafeRun(ThreadStart start)

  ----> Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException : Method <method> failed with 'full flush buffer should be empty: System.Collections.Generic.List`1[Lucene.Net.Index.DocumentsWriterPerThread]
', and was translated to Microsoft.VisualStudio.TestPlatform.TestHost.DebugAssertException to avoid terminating the process hosting the test.
Result StandardOutput:	
Culture: ru-KZ
Time Zone: (UTC+04:00) Astrakhan, Ulyanovsk
Default Codec: CheapBastard (CheapBastardCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=crazy): {}
Nightly: False
Weekly: False
Slow: True
Awaits Fix: False
Directory: random
Verbose: True
Random Multiplier: 1
IFD 1 [11.08.2020 18:10:18; NonParallelWorker]: init: current segments file is ""; deletionPolicy=Lucene.Net.Index.SnapshotDeletionPolicy
IFD 1 [11.08.2020 18:10:18; NonParallelWorker]: now checkpoint "" [0 segments ; isCommit = False]
IFD 1 [11.08.2020 18:10:18; NonParallelWorker]: 1 msec to checkpoint
IW 1 [11.08.2020 18:10:18; NonParallelWorker]: init: create=True
IW 1 [11.08.2020 18:10:18; NonParallelWorker]: 
dir=MockDirectoryWrapper(RAMDirectory@548c1d lockFactory=Lucene.Net.Store.SingleInstanceLockFactory)
index=
version=4.8.0
matchVersion=LUCENE_48
analyzer=MockAnalyzer
ramBufferSizeMB=16
maxBufferedDocs=493
maxBufferedDeleteTerms=-1
mergedSegmentWarmer=
readerTermsIndexDivisor=2
termIndexInterval=58
delPolicy=SnapshotDeletionPolicy
commit=null
openMode=CREATE_OR_APPEND
similarity=RandomSimilarityProvider
mergeScheduler=ConcurrentMergeScheduler: maxThreadCount=1, maxMergeCount=2, mergeThreadPriority=-1
default WRITE_LOCK_TIMEOUT=1000
writeLockTimeout=1000
codec=CheapBastard
infoStream=ThreadNameFixingPrintStreamInfoStream
mergePolicy=[LogDocMergePolicy: minMergeSize=1000, mergeFactor=13, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=True, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8796093022207,999, noCFSRatio=1]
indexerThreadPool=Lucene.Net.Index.ThreadAffinityDocumentsWriterThreadPool
readerPooling=True
perThreadHardLimitMB=1945
useCompoundFile=True
checkIntegrityAtMerge=False

IW 1 [11.08.2020 18:10:18; t6]: commit: start
IW 1 [11.08.2020 18:10:18; t5]: commit: start
IW 1 [11.08.2020 18:10:18; t2]: commit: start
IW 1 [11.08.2020 18:10:18; t6]: commit: enter lock
IW 1 [11.08.2020 18:10:18; t1]: commit: start
IW 1 [11.08.2020 18:10:18; t3]: commit: start
IW 1 [11.08.2020 18:10:18; t6]: commit: now prepare
IW 1 [11.08.2020 18:10:18; t9]: commit: start
IW 1 [11.08.2020 18:10:18; t7]: commit: start
IW 1 [11.08.2020 18:10:18; t8]: commit: start
IW 1 [11.08.2020 18:10:18; t0]: commit: start
IW 1 [11.08.2020 18:10:18; t4]: commit: start
IW 1 [11.08.2020 18:10:18; t6]: prepareCommit: flush
IW 1 [11.08.2020 18:10:18; t6]:   index before flush 
DW 1 [11.08.2020 18:10:18; t6]: startFullFlush
DW 1 [11.08.2020 18:10:18; t6]: anyChanges? numDocsInRam=10 deletes=False hasTickets:False pendingChangesInFullFlush: False
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_0, aborting=False, numDocsInRAM=2, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_1, aborting=False, numDocsInRAM=2, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_6, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_2, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_3, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_4, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState DocumentsWriterPerThread [pendingDeletes=gen=0, segment=_5, aborting=False, numDocsInRAM=1, deleteQueue=DWDQ: [ generation: 0 ]]
DWFC 1 [11.08.2020 18:10:18; t6]: addFlushableState Docume< Truncated >



I have opened #326 to put a higher priority on this task - we need to get the "asserts" working in Release mode to ensure that our test suite is testing everything it is supposed to, so more issues like this don't creep in without us being aware of them.

… exception when using OpenBitSet.FastGet() instead of OpenBitSet.Get(), since the size of the bit set is unknown.
@Shazwazza (Contributor) commented:

Yes, of course - I forgot about the Release vs. Debug distinction. I've just compiled and re-run all tests locally in Release, and I get 2 other failing tests. I'll keep re-running them to see what happens.

TestThaiAnalyzer

TestRandomHugeStrings

Test Name:	TestRandomHugeStrings
Test Outcome:	Failed
Result Message:	
Multiple failures or warnings in test:
  1) Expected: 𢌑, Actual: 𢌑𧥣𧡐

term 79, output[i] = 𢌑, termAtt = 𢌑𧥣𧡐
  2) Thread threw exception: NUnit.Framework.AssertionException: Expected: 𢌑, Actual: 𢌑𧥣𧡐

term 79, output[i] = 𢌑, termAtt = 𢌑𧥣𧡐
   at NUnit.Framework.Assert.ReportFailure(String message)
   at NUnit.Framework.Assert.Fail(String message, Object[] args)
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AssertTokenStreamContents(TokenStream ts, String[] output, Int32[] startOffsets, Int32[] endOffsets, String[] types, Int32[] posIncrements, Int32[] posLengths, Nullable`1 finalOffset, Nullable`1 finalPosInc, Boolean[] keywordAtts, Boolean offsetsAreCorrect, Byte[][] payloads) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 147
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckAnalysisConsistency(Random random, Analyzer a, Boolean useCharFilter, String text, Boolean offsetsAreCorrect, Field field) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 1243
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckRandomData(Random random, Analyzer a, Int32 iterations, Int32 maxWordLength, Boolean useCharFilter, Boolean simple, Boolean offsetsAreCorrect, RandomIndexWriter iw) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 928
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AnalysisThread.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 716
Result StandardOutput:	
Culture: ru-KZ
Time Zone: (UTC+04:00) Astrakhan, Ulyanovsk
Default Codec: CheapBastard (CheapBastardCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=crazy): {}
Nightly: False
Weekly: False
Slow: True
Awaits Fix: False
Directory: random
Verbose: False
Random Multiplier: 1
Result StandardError:	TEST FAIL: useCharFilter=True text='rmoirou \ud808\udd1a\ud808\ude88\ud808\udc0c\ud808\udc99 \u2079\u208a\u2078\u2083\u209b\u2089\u209e\u209b OpbRmje \u2040\u2039\u2068\u2028\u2021\u2058\u2000 vewlawvj xmychadgx \ue4f0\u3027\u075d\ued92\u5a89\ud809\udf8fW \ud800\udd6f\ud800\udd57\ud800\udd7f\ud800\udd83 wwjp[ofa okgaPVG \u1ddd\u1dfe\u1dc4\u1dfa\u1dd8\u1df7\u1df6 \ud834\udf58\ud834\udf4a\ud834\udf1b\ud834\udf0c qwloruoa kkltpvazd \ud455\ud9c3\udc07#\ud8ee\udd14\u9522\ud9a0\udddf 1487093 \u1dfd\u1ddc\u1de1\u1dd5\u1ded\u1df3\u1dde \"dmez\n&# sorektwg \ud802\ude11\ud802\ude03\ud802\ude1d\ud802\ude24 \ued55\u06c2\uecbe\ue74a\u0122\ufa11\uec79\uc74a\ue489 yaajotppi \ueaf4\ud95c\uddfa\u0438\ud9d7\udea7\uf743 \ua918\ua915\ua929\ua909\ua908\ua916\ua906\ua903 ntkytorv \udb07\ude55\u8b30\u0743\u01d0\u3385\u000e pVdyRaaQm dignitnz luymwyrou <??><scri bwepgcu \u0539\u0584\u0559\u0538\u055b\u053a\u057d\u0543\u0547 mcjfvjm \u2745\u2708\u277e\u273f\u270c\u2716\u279b \uaa72\uaa67\uaa75\uaa7d\uaa7a\uaa64\uaa6b\uaa68\uaa68 \u0d8f\u0dcb\u0dc4\u0dcb\u0da8\u0db3\u0dcd lbdayvgd \"pdq300 hktjwdi dxiqscwg ciujlwr ylaizewky tormunoc \u2ae9\u2a91\u2a5c\u2a96\u2a36\u2a5d\u2abb\u2a5c\u2ac7 \u07dc\u07e9\u07fe\u07c2\u07f8\u07d5\u07d9 \uf378\u0414F\u028a\u0003m\udae6\ude6e \u3114\u3104\u3115\u3115\u312d\u3126\u3115 sasvdqqv \u221a\u229e\u227a\u2283\u226b\u2201\u2289\u2288\u221a vuqpcmx v).(p{1,5 8647639 )\ud638\ud93e\udce8\u10e8\ufc9b\ue030g \u0516\u0506\u0516\u051e\u051e\u052e\u050c aqtjdfar bjbcrnoet yrfveks uikpqsp dmpskdygo zfggtvc \u2148\u2122\u212f\u212f\u2140\u2132\u2102\u211f\u2109 xlnbxgfw \u2eb9\u2edf\u2e89\u2eb6\u2eee\u2ee5\u2ee0 \u1619\u15eb\u159c\u1493\u1577\u15c5\u15b6\u15e0\u165f \ud85b\udd64\ud848\udf11\ud85e\udd63\ud85e\udc50 ndswolowm \u0b41\u0b09\u0b73\u0b11\u0b45\u0b71\u0b7e cuutfob mleqclxfr \u878e\ue4ec\uf2ae$\u84e5e\uf0f5&= \u2d7d\u2d78\u2d6f\u2d5c\u2d3d\u2d36\u2d59\u2d5e\u2d49 gnlvk?{0, dznvsio rqebnglv nrvscgsec \u24af\u24e5\u247b\u24c8\u249a\u24b8\u2465 \u319a\u3195\u3192\u319a\u3198\u3193\u3198 h|y)|m| ybehrjjcb hmniijop kehjhwlk vsltivma clialaw lhrksdYj </script \udaf3\udddfr\u00f8\ud8d3\udf3b\u057a\uff7e\uee88 actpozjps \ufe06\ufe0a\ufe05\ufe0a\ufe0d\ufe0c\ufe0c\ufe0e\ufe0b \u3013\u3021\u3029\u3034\u300a\u3022\u3034 gyfcgqi \u29e8\u7f3e\u1824\ubb99\u0209\ue40d\u001eJ\u067a njpaiduk \u87c7\ud9d4\udc93a\u6573\ue5db\ufbb5 vstyhiz ugkvvgjm \u0fb4\u0003\u0ea4\ud812\udc90\u9228\u04b3\ufc61 Hnrhhtbf hwlrolay nuahyrlfr \u01aa\u01fd\u0184\u01d5\u0209\u023d\u01a6\u0205 ayoiyujin jvdpqddc mxlcomhs evtwxxu \ud800\uddff\ud800\uddd3\ud800\uddd5\ud800\uddeb \uaa65\uaa71\uaa7c\uaa65\uaa75\uaa6c\uaa76\uaa63 hochnclbf \u0006\u02f5\ucf32\u02e6$W\u0418 hjelclwu \u0009\udba0\udf29\u8c73\u8d23\ufb95\u0190\uf6bc \u01c9\u0242\u022d\u0219\u01ea\u0181\u018b\u01d9\u0188 {0,5}{1 zcvwdetp tgvrjvlg \"\\\"&ibs jxslhxgwu hqmxluyd qqugmva +\udafd\udf91\u0014\u0498\u5fda\ue000\u07b2 \u0dc3\u0dde\u0ddf\u0dfc\u0d8d\u0da6\u0d9f\u0da4 jeluicdd \ud802\udd02\ud802\udd00\ud802\udd06\ud802\udd1b L\uda18\udf9a\u06c5\ud8b0\udeda=gU kfwialw \ue900\udb2c\udf97nIv\u891cA fucubvrdg ukbxqzec </sCRIP ){0,5}x ihecpbi \u1748\u175b\u1740\u1747\u174a\u1749\u1756 \u03c7\u03f5\u0393\u03e5\u0399\u03c7\u03d6\u037f xcfvdnuez \u279a\u2774\u271b\u2761\u2782\u2799\u2765 (s{0,5}m \uf900\u80ab\u03e9\u05ac\u8170\ued25\u1c76\ue337 fhbryuxb xdaorabo jwtphtn \ud802\udf75\ud802\udf61\ud802\udf6e\ud802\udf73 emtmjqvbn mzrywmne mqivodbsg i\udb27\udcb2\uf927h:\uf23c\uf40b\u001c mkocfyo \u9197q\u04fb\u000bN\uf49c\u1e96 vbraceuwn dtkwxhuml 
\u2fb4\u2f27\u2f22\u2f93\u2f75\u2fc2\u2f27\u2f00 zcuwxuxqm \u0011\u2492\u827a\ud882\udc1dF\u1729\uf055\u5583 nj\u05dd\u0004\uec7eW\u9a8c\u000b\u6382 \n\"</scr Ek\udaea\udcb1\udbba\udc48+ \ua22e\ua3ce\ua0d2\ua336\ua347\ua1b4\ua0cf\ua380 -xm-[)]-x lypfcott \u2cda\u2c8b\u2cd8\u2c98\u2c95\u2c8e\u2ccf\u2cc7 gG0\u065e\u01f3\ue133\n bffxuwf \ud834\udf35\ud834\udf2f\ud834\udf49 slfswjdwr vdeylfnlg 6@;I\u009fx\ue55a\u03c9 fskagwq \u23f0\u2334\u23e8\u233a\u234e\u239d\u2370\u230b --><?>< \u1db3\u1d98\u1d98\u1d9d\u1dbb\u1da4\u1d81\u1db9 wwdnamhh |{0,5}. lbdcogw sptrttla fjbtbwldf \u3357\u33b2\u33b6\u33e3\u334c\u33b5\u331e\u3341\u3366 x]i-)cg{ zsbucou qrosvse \u0004\u0257I`\uaf37\ufe46\ufd19\ufc48X thgpjsxxf \ud800\udf0c\ud800\udf12\ud800\udf02\ud800\udf07 \u0c87\u0caa\u0cc5\u0caa\u0cca\u0ca0\u0cf3\u0cd0 puoonjjj \ufeb4\u06b0^\u82eb\u07d7\u073e\u5e92\u07b2 azjmwas anznlfu \ua978\ua969\ua96f\ua97b\ua96d\ua974\ua972 \ud982\ude8f\udb3e\udc51\u5f48L#\ufa35 xxmhkqeo \n<?</   vzaafeks jscsanj PqYDdUba ><  ilpn     </scri serjmhzze >\\'&#x'< sseadyg \u1a6c\u1a6c\u1a64\u1aae\u1a3b\u1a80\u1a38\u1a81 \u00ec\uec6a\u001b\ufa27\u05a7\ub238\ufb24 \u175c\u1742\u175b\u1759\u174d\u1741\u1757 \uec7a\uda6f\udd0e\ufa81\ufe3b\ue58ei qmjoynxl iuhbxrzf otrxvlx wqcbkpi sbibimzk \ufb21\u8bd1\ue379\u0169\ue0db\u0011\ud9a2\udcdd zvuvbvcil l4\ue3d5\uda78\ude99\uefe6\ub1ad \ud809\udc3f\ud809\udc3b\ud809\udc33\ud809\udc6a fsaovmazv dhijpniru jwlwgrf </script <!--#</s uaaggsana bpbqcifax okauyoc \u0580\u74ae\u0865\uf01b\u0169\ufd0d\ua2fc\uebb4\u01d6 lwiybeep bnjryrlvo pcpooyvi \ud802\udc4e\ud802\udc55\ud802\udc59\ud802\udc43 \ua709\ua708\ua716\ua709\ua711\ua717\ua70f\ua708 </   vkvf zlsymyy nfecdocw njonsmqxv \u302d\u3022\u300f\u3008\u3013\u3023\u303e\u3020\u3039 \uf64c\ud80a\ude17\u65dfh\ud9cc\udd32 \u2324\u230f\u23d8\u2359\u23ce\u23d8\u234d i0\uecdf\u5532\u3235\ufd20\uf094\ub6aa nifwpewv \uffb6\u1767\u034b\u02d3\ue53f\ue426 \u3205\u322d\u321b\u326d\u32e6\u32db\u32b8\u325c *\udbcb\ude20C\u0010\u0019\u0342\u8817\uf136 <Br>&#</s vpdfytj E\u834e\udb8d\udf11\uf955\u02d4\u067f ufziceoei ylckoelia oddhlvoqn <bR-->?> \uaa77\uaa6d\uaa73\uaa7f\uaa76\uaa6e\uaa76\uaa79 bivdqxu ><!--<scr meopvvpn cvweclmh dqaxzvqu &#<     pbcipst \u1ce9\u1ce8\u1cd0\u1cf3\u1cdd\u1ce8\u1cf7\u1cd9\u1cd0 \u31f4\u31f4\u31fb\u31fd\u31f1\u31ff\u31f6\u31f2\u31f1 zxkhlbylx \u07e4\u07c1\u07c1\u07d9\u07dd\u07eb\u07ff\u07e0\u07f3 yyghomsb {1,5}e{0, </p>>\\\"< vveklwysz vuxkwrss q[{0,5}{ xkhstiv <script \u17af\u179b\u17ea\u17a0\u17aa\u17d2\u17d0\u17f4 dswmzexh eXxbgfdt mztjmskr kbwayivtt \ua4b4\ud8fb\udc2a\ue961\uf6a9\u0331\u3306 \ufe54\ufe54\ufe50\ufe63\ufe59\ufe5e\ufe66\ufe5b oppvqpqk xuhgeua [|eo.cw \u072b\u07c0\u001c\uefd5\u0013\u7583\u001e \udbb9\udd6e\u8f3d\u93a5\u238ff\ucf12 irdbfznbm \u03a5\u03d9\u03da\u03d7\u0397\u03e8\u03b8\u0394\u039e tzypkxru \ub477\ue920\uaf8a\u8246\ufbf4\u049b\uec8c {\ud93c\udf15\uc959\udb45\ude8d\u4047 zsydygyzh \ufad4\ufab8\uf91c\ufa9e\ufa9b\ufa61\ufaa7\ufa13 \ua6b6\ua6bc\ua6ca\ua6c3\ua6de\ua6b0\ua6cb syjcqiawr psprfwqs itxpuomi r[)|s.|nv alnnqdy \u2c7c\u2c69\u2c7f\u2c68\u2c60\u2c74\u2c6e \u0007\u00d8\u111d\u02ca\ue00b\ucb33L cjgvndvgt naxjzld \ud802\ude1e\ud802\ude14\ud802\ude0a\ud802\ude2f ifzhhsi pd?])dc \uabf3\uabfa\uabea\uabf8\uabe9\uabd9\uabf3\uabda bridupl \u115e\u1197\u110d\u11d6\u11cd\u1181\u1156\u118d \u11b3\u1110\u118e\u112d\u11ab\u11f9\u1119\u1199\u1193 \ud808\udc58\ud808\udc13\ud808\udf89\ud808\udda9 jnxpqwxk ot{0,5}p) \ud802\udd06\ud802\udd16\ud802\udd0e \uf675\ue48f\u0420\ufcba\u0255\u0015 -qtg{1,5 cpdzksooj 
\u2396\u2366\u2374\u237e\u23ca\u2396\u2358\u23df\u2363 \ua836\ua835\ua836\ua838\ua83b\ua832\ua837\ua831 jezabnsa \uefb1\uece3\u0636\u06bfp\u1a9e dpkrxpmv \ua7bd\ua7dc\ua7da\ua793\ua76a\ua7a0\ua786\ua798\ua7ec \\'</scrip bprkoukk jrawcus \udb9f\udeac\ud858\udd64\u2c53T\uf6a5\u0011\u07c0 \u0409\ud9a5\udf9c\u05cb\u077e\u6f0d\uc8fc\u0681\ufaa0 \u2d1f\u2d2a\u2d0e\u2d0d\u2d07\u2d1a\u2d03\u2d19 rgzdibfvu \u9984\u07e7\ud833\udc9d\u0116\u5431\u3dc6e \u2271\u2296\u22ac\u22f4\u2225\u226a\u2245\u2249 \u5704\ue605\ud96c\udf21\ub004\ue11a\u043b\u0009 yjzkjjmeu \u042f\u01ed\u04e0\ud8ce\udda7\u2c4e3 axvvopdzb \uda39\udea9'\uf100k\u036a\u0008\u05df mwgbdbtrg dovdrhuym sppagdl \u4a08\udbb5\udcf2\u01e4\u0006\ue605 unbchdl \ud800\udf34\ud800\udf35\ud800\udf42\ud800\udf37 &#372777 \u01d6X\ue5d9\ueb3b\u03af\u03a6\ue647\ue47f phzhyjbut hxwjhgzdl \u3155\u318b\u3186\u318b\u3137\u3168\u3135 ZLHhohJY \ud801\udc15\ud801\udc18\ud801\udc2a \u06a0\ud912\udc54M\ud8ee\ude52\uda45\ude35 \ufe6b\ufe5c\ufe5a\ufe52\ufe63\ufe68\ufe57\ufe65 )l.gt-(.) jfnzuroij ozvcbrydm \u29f3\u299a\u29be\u29f9\u29bb\u298f\u29e5\u29c8 \udaf6\udd6e \u8164\u0282\udb4f\udc38\uea05 vkimckyjd \u2578\u252d\u250f\u2534\u2523\u253d\u2505\u257f xqblmmg mpvhtlu bjmmdvr \u252a\u255a\u2524\u2568\u2524\u253a\u2543 mwuytzozb \u2869\udae6\udd1d2\u01747\ufe52\u493a \\';<?;12 ihitinvb uykdosn betomli xwkpnhet \ud867\udcda\ud84d\ude3f\ud859\udc55\ud84d\udf1e \ua49e\ua4c5\ua493\ua4a1\ua4ad\ua4b3\ua490 \u1636\u1522\u1457\u1595\u15d3\u1606\u142a R\ucb69\uf110\u0007j\u1895\u0211\ufed9 \u18db\u18f1\u18e2\u18e5\u18e1\u18e3\u18f5\u18e5 \ueb63\u01c75o>\u000f\u07d9 ajhxeoji \ua6ca\ua6f5\ua6cf\ua6e1\ua6e1\ua6aa\ua6bc lhvvznt yxozzqk sxxcbvde \uec03\u5f7d\uf159\u0281\uac8dy\u8016\udbcb\udfaf \ucd19\ufa0a\uf4b7\ud86a\udeeb\u51ad\u1f01\ucc53d \u0d3c\u0d0c\u0d1d\u0d11\u0d6c\u0d0c\u0d41\u0d5b \ua135\ua031\ua140\ua46d\ua3cd\ua17f\ua366\ua19c ohtjeha gew{1,5} \ud800\udf98\ud800\udf8b\ud800\udf90\ud800\udf87 \u32b6\u3272\u3282\u320f\u3238\u32c4\u32c3 \udb40\udc66\udb40\udc0a\udb40\udc16 mbtvntaq \uebdb\u00df\uf480E\u47a0z\ubfb2 aoqzljxl \u0556\u0576\u0564\u056c\u056c\u0550\u0581\u0585\u0580 \u89f1`\ua24d\ud878\udf07\u0577B tdojvfu *\ue958\uf18a\u5a57\u001d\uad0f{\u5568 v\n<\u07c7\u0243\u66dc7 m\ud864\udecc\ud946\udcfc\u07f9\uf8c6\u010f \u247c\u24c9\u24c5\u24d1\u24ac\u2476\u247a\u24de }\u070c\uf15a\uebc7\ueab9\u5eed\u468e\u0690 oxpluod cbbtosj \u2dd2\u2d95\u2dcb\u2dd1\u2d8b\u2d98\u2d96\u2db2 uygolnl fhglcic \ud809\udc45\ud809\udc3b\ud809\udc6b\ud809\udc59 ?sr(wi(i? 
pvphhdcx ilppxjnrj gpjacpkyw gwbzgtqtj fmojstnk zthobwt nwe{1,5}] spqjveifo tnlzczf (\u9dc2\ud9d8\udca7\u0016\ue25d\ue2f9\ud6e3 jajlqgu \uc1d6\u051d\u0997\u065f\u07af\u1731k\u8807 \u4b4b\u0765\u0515\u0011\ub6e7N\ud838\udf54 \u1b69\u1b48\u1b5f\u1b71\u1b34\u1b1f\u1b6f cztqxvgm \u02f7!rm\ua005\ud803\udd72\u074e O\u1747M\uf90f\ueb80\ud8f3\udc70 \u0430\uee9c\uda4a\udcba\u133an\u53fd\u7c42\u61c8 aiibnvf \ufe50\ufe6d\ufe53\ufe57\ufe64\ufe67\ufe69\ufe5b lcqpmaw fohfeysra wzspzwmon mdzmskpui \ua598\ua541\ua607\ua5de\ua543\ua579\ua50f\ua590 XgZdlMe fjuvybfp \u02a7\u0276\u027c\u2c6e\u02a9\u0282\u01b7 qw.r{0,5 \ue261\ue393\ufe93\ud9ef\udc48#j mbcecutxn \ufa6a\u0011\u000f\u07d8\ud9d5\ude8d\ud97a\udecb@ \u048e\ucc6el\uf466\uda1c\ude59\u02d3\ue377 vikyrub 13158513 \ud800\udd3f\ud800\udd32\ud800\udd00\ud800\udd0f jandkrsqp \uda35\udca0\ue7ac\ud830\udc41\udbcb\udf16} \ud83c\ude0e\ud83c\ude5d\ud83c\ude4f\ud83c\ude94 midzjoyq \u119e\uf208\ufea4\uff18\ufa0f5\u058e \uf0c9\u00af\u00df\ud86c\udd2e\udbef\udf7a kflnhwgs \u7482\u048a\u074f\uf819\u067c-I \ud8cf\udc6f\uec13,\u0129\ue52c gypeecxl \u1b5a\u1b2d\u1b7b\u1b30\u1b34\u1b39\u1b7a \u109b\u1074\u1017\u104e\u102d\u105f\u1032\u102e fygbnpotp gkeafjtg UZxWFdvy gpsvexk \u727e\u001ay\uda2d\udd49\udace\udf96 &#--><! chvgvzs k.y)z]|]( bbobpho ekrzvlvz \ue5e2\ue21f\udaa9\udf1e10\u1427\uf5c2 ovyuwqll imycshl \ud8c0\udcff8I\uf2e2\ufaa0\ud818\ude07 \uda3d\udf0bg\u2c5e\ud914\udd98\u001e\u592c \u258f\u2594\u259a\u2581\u2586\u2595\u2590\u2598 qmwquum wsvfruf \u2c73\u2c70\u2c6f\u2c70\u2c72\u2c7e\u2c62 \ud96e\udf16\u17c2\ueacb8\ufea6s \ufbb2\uaccb\u49d6W\u07ca\ud8fc\udf37 ofbqdse ecadnhh ykryxgrtd jqvbsvy \uab13\uf997S\uda06\udd62\udb6d\uddd6\u0359\u6e42 |t{1,5}|j \u034d\u3950\ud9b5\udf12\\\ue980\ubb04 \u210a\u2106\u213d\u2110\u2136\u212c\u2129\u2132\u2102 quihentsa pwssuec \u319d\u3199\u319e\u3198\u319d\u319e\u319c\u3191\u319d \u024b\uab35\ud948\ude3c\uabec\udbcc\udcd2\u060d kmlntxmj kmzduuas \u21a7\u21ca\u21b9\u21be\u21fb\u21ed\u21a1\u21b8 <script> </script \uda7d\ude18\u0423<\u0019\ue774 </script> \ufd2c\u145b\u0588\u057a\uf8e0\u10c9\ufbaa\u2f12 \u0b98\u0bfe\u0bbc\u0be2\u0bee\u0b98\u0bca rtqolvc nwmfulksq sodmqnuyu \ud803\udc28\ud803\udc12\ud803\udc0b\ud803\udc49 \u0708\uebdb\u0183\u0004w\ue0cfhq \ud802\udf37\ud802\udf12\ud802\udf11\ud802\udf29 pcfromfn \u0010\u0009!=<D7C\u0001 \u18be\u18e5\u18fa\u18c1\u18b5\u18e8\u18f6 \ud800\udd10\ud800\udd3e\ud800\udd2a\ud800\udd28 afdljtq orhlwkn \u1072\u1004\u1001\u1007\u1069\u1042\u104b\u1079 \u310e\u310c\u310a\u310b\u311e\u310b\u312f\u3122 \ua4b6\ua4c3\ua4b0\ua4c9\ua4bc\ua4b6\ua4b0 pwszneyo \uedf5g\u0009\uf581>\ud9c8\udcf4 \uabcc\uabee\uabf5\uabd3\uabec\uabf6\uabd4\uabe1 <scRiPt \ud834\udd84\ud834\udde8\ud834\udd7c\ud834\udd82 l0\u0010\u001b\ue529\ud93a\ude73\u0207 \u1e39\u1edd\u1ec7\u1e1a\u1e90\u1e0b\u1ef6\u1ea2 \ufe95\ufea5\ufecc\ufeb8\ufed8\ufee3\ufefe\ufecb brzwmvjvz \u2d1e\u2d26\u2d1d\u2d00\u2d01\u2d0d\u2d01 \\'&#>\\\" efxgczhte yBIueEda sdgqvnc \ud82f\udec8\u6206\u037d\udbc5\udc55\uf9c7\ud9f4\uddcf evaqczfcg |q({1,5}o kjnlasan \ufe08\ufe02\ufe0a\ufe06\ufe03\ufe04\ufe0f\ufe04 \ueff3P\u2a82\u0913\ud827\ude82\u022e \ud834\udf7a\ud834\udf6f\ud834\udf63\ud834\udf78 \u009e\u079c]\u01ad\u4819\ud82b\udfbfaA waebltqb \u04a8\u9ec5\ud8b9\udff2\ub2b5\u06ed\ud9f3\ude2c\u039b ciolsrbj vd-{1,5}? 
aoffhni -df-?z(f \u1d82\u1db3\u1da3\u1d9a\u1da4\u1d8b\u1dbe\u1d80 \ud802\udd1c\ud802\udd11\ud802\udd02\ud802\udd0a -g{1,5} \ufe18\ufe1a\ufe17\ufe14\ufe12\ufe15\ufe19\ufe12 iippsej \u044b\ued87\u61ba\uf81b\u011f\u160b\u6a4d Z\u5147\u0745\ub4f1\n\uecad\u07b8 \u0716\u5a10\udb24\udf82\u00c2\ua4b9 \u31aa\u31ae\u31a0\u31af\u31bd\u31aa\u31aa\u31bd\u31a3 eyb-{0,5} ahzesvry rtgaehshs dapujfsz \u0005\u5ec3\ue273\u456e\u5c8b\u0297\u7892\u000fT \u6dbd\ufa6c\u15e9\u05ec\u0018\ud9d5\udce9 \ua4cd\ua493\ua4c0\ua4a8\ua4cb\ua491\ua4bb\ua49e qfgxpxtic \ueae1\u01dd\u1547\u0180\u00df\u0679w\u04ca@ d]x?u|? \\\"\\\"?>\\\"\\ chmnlkxz pfwfodw xvunlpd \u077d\u075e\u0752\u076f\u0760\u075b\u0754\u077b\u0763 (zsx{1, \"?>'</p xjsrhbezs verhwtho ttbdfcdai \u2cbb\u2cd8\u2cb8\u2ce8\u2cc5\u2cac\u2cfc \ud800\ude86\ud800\ude93\ud800\ude9a\ud800\ude8a tpsyxrz \u00e5\u6548\u7c5f\uea44\uecce\u0347\u0001\uf127\u0015 qasgzlvlh \ue914\u8d93\ufd1e\u059c5\u074a\u6971\u0012 \u0355\u0436\ud988\udf3a}\\\u02e4\u3c1bO D\ue5f9[\u0426\udaad\ude7a\u001e\u0005 kdyrhgos \ud803\ude6f\ud803\ude65\ud803\ude6c\ud803\ude71 bthilhly zfxnajxbk iwgbnlosn smWkRFvr ytjgoesrt yezaqmbua \u0ae1\u0aeb\u0ac6\u0a8d\u0adb\u0ac2\u0adf\u0aa1         ptodmfn \u1b41\u1b72\u1b1e\u1b00\u1b56\u1b5d\u1b3c \u4ddd\u4dd5\u4dcb\u4df5\u4dfa\u4dcf\u4df3 poxzluqs .eus{1, ufwspuuu \u30f1\u0604\u0713\uf977\u1490\u0005\u03b1Sg \u01e5\u469b\u5a7c\u2bd7\u068d\uda18\udc83G\u0224 rqoppyypk \ua81dn\u0611\u0009\u3d2a\\ bopwldew {\u001e<\u0673\u041b\ud57e\u464e\ua884 ciqsawsik \uf96a\u48db\u044eV6\u6820\u026e\u3477 nspyshu diinlypyy rgzxwyzq rflidrja clabuitt gvzxdsi \u208a\u2071\u2089\u2071\u2081\u2084\u208a\u2076 xwsbcxy svyxcjll REDfywsim zykehicia \u0607\u062e\uc410\uf43d\uda27\ude7f\u00a9\uc110\u038d xildafrwb \u0749\ufa10\ue286\ueff4\u04a2\u0001\u0731 \u05b0\ufafa\ue0ec\ueaa1\ud891\ude00\u0009 \u16ce\u16f1\u16df\u16c6\u16b0\u16a3\u16ec\u16ab \ud800\udd62\ud800\udd49\ud800\udd8d\ud800\udd55 \ud82d\udfdd\udaec\uddd1\u0771k\u0580\u37e0\u037c \udade\udc2e\udb9c\udf67#\uba9d\u001e\ua198 ]\ud8dc\udc38'\ue67e\u07e8 fphajfxnv \u0466\ud8c8\udf7b\uda1f\udf0d\u0560\u8a13\udacc\ude48 enpbuye ojxxrniq dshgiinyb mtanfsdvp \ud803\ude7a\ud803\ude7a\ud803\ude66\ud803\ude7b \ud802\udc5e\ud802\udc52\ud802\udc5d \uaa65\uaa68\uaa6f\uaa71\uaa75\uaa77\uaa6f kcoepdmop qwuosoej \ueb16\u0483\u0199\ud969\udfaa\uda7a\udc50 \u277d\u27b5\u27ab\u2709\u2733\u277d\u278e\u274c a]q[knl- dvwpstdh bJk-gzs{ \ud808\uddbb\ud808\ude5e\ud808\udc26\ud808\udffc bjvchmve suirqzkkm \u2de7\u2de6\u2df7\u2df1\u2dfd\u2de3\u2de5\u2dee \udb56\udcdeW\u0019\u04e8\ud89a\uddbd cpyjjxtja \udb7b\ude3e\ud8cb\udc2e8\uff58\uda52\udc69 \ud931\udeb6\u02ec\u02bf\u0017\uda29\udfff ehpumjjq psecrcdwx \u2ce5\u2ce5\u2cd1\u2cee\u2cdd\u2ca4\u2c9b geupcpmv jswDYnF dochounge \udb75\udea7\ud92d\udca6\u0012\ufc9e\uca4c harylko whmuyjt ko)hc{0, yqduoufny \u3f20\udb45\udfcfj\u2cad\u01ed\ue020 sgurrpx lbfjxnv \ua84c\ua85e\ua842\ua843\ua84e\ua861\ua85e\ua866 njddumd hgrmcrj \u0112\u001a\u2610\ud83b\udf26\u5b12\u4971\ud919\udccb lmvthdtr \ufb14\ufb23\ufb03\ufb02\ufb35\ufb29\ufb18\ufb03 _\u000f\u036d\ud9ee\udfac\u0361\ud992\udd8a\u52e8 z{1,5}]da hswfutfd \ud800\udfb7\ud800\udfd8\ud800\udfa0 vxtfqzo ijbyldwgt juqtmfys kFysniiJ \\\"&#xc8 y)|w.w] \ud800\udd8e\ud800\udd52\ud800\udd53\ud800\udd89 jegyzlru kqovrysp )s){1,5}. \u1730\u172a\u172f\u1720\u1735\u1737\u1739 vlhwieu'

TestRandomStrings

Test Name:	TestRandomStrings
Test Outcome:	Failed
Result Message:	
Thread threw exception: System.IndexOutOfRangeException: Index was outside the bounds of the array.
   at Lucene.Net.Analysis.Th.ThaiWordBreaker.GetNext() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Th\ThaiTokenizer.cs:line 178
   at Lucene.Net.Analysis.Th.ThaiWordBreaker.Next() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Th\ThaiTokenizer.cs:line 166
   at Lucene.Net.Analysis.Th.ThaiTokenizer.IncrementWord() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Th\ThaiTokenizer.cs:line 106
   at Lucene.Net.Analysis.Util.SegmentingTokenizerBase.IncrementToken() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Util\SegmentingTokenizerBase.cs:line 87
   at Lucene.Net.Analysis.Core.LowerCaseFilter.IncrementToken() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Core\LowerCaseFilter.cs:line 59
   at Lucene.Net.Analysis.Util.FilteringTokenFilter.IncrementToken() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.Analysis.Common\Analysis\Util\FilteringTokenFilter.cs:line 89
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AssertTokenStreamContents(TokenStream ts, String[] output, Int32[] startOffsets, Int32[] endOffsets, String[] types, Int32[] posIncrements, Int32[] posLengths, Nullable`1 finalOffset, Nullable`1 finalPosInc, Boolean[] keywordAtts, Boolean offsetsAreCorrect, Byte[][] payloads) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 242
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckAnalysisConsistency(Random random, Analyzer a, Boolean useCharFilter, String text, Boolean offsetsAreCorrect, Field field) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 1243
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckRandomData(Random random, Analyzer a, Int32 iterations, Int32 maxWordLength, Boolean useCharFilter, Boolean simple, Boolean offsetsAreCorrect, RandomIndexWriter iw) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 928
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AnalysisThread.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 716
Result StandardOutput:	
Culture: ru-KZ
Time Zone: (UTC+04:00) Astrakhan, Ulyanovsk
Default Codec: CheapBastard (CheapBastardCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=crazy): {}
Nightly: False
Weekly: False
Slow: True
Awaits Fix: False
Directory: random
Verbose: False
Random Multiplier: 1
Result StandardError:	TEST FAIL: useCharFilter=True text='\uf79b\u7e48\u0412\n\ud895\udff53`'


@Shazwazza (Contributor) commented:

Just ran it again and got a different failure. I'm going to close the solution, rebuild, and re-run to see what happens; I'm not sure whether these are false positives or not.

TestICUTokenizerCJK.cs

TestRandomStrings

Test Name:	TestRandomStrings
Test Outcome:	Failed
Result Message:	
Multiple failures or warnings in test:
  1) Expected: 〹〤, Actual: 〹

term 1, output[i] = 〹〤, termAtt = 〹
  2) Thread threw exception: NUnit.Framework.AssertionException: Expected: 〹〤, Actual: 〹

term 1, output[i] = 〹〤, termAtt = 〹
   at NUnit.Framework.Assert.ReportFailure(String message)
   at NUnit.Framework.Assert.Fail(String message, Object[] args)
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AssertTokenStreamContents(TokenStream ts, String[] output, Int32[] startOffsets, Int32[] endOffsets, String[] types, Int32[] posIncrements, Int32[] posLengths, Nullable`1 finalOffset, Nullable`1 finalPosInc, Boolean[] keywordAtts, Boolean offsetsAreCorrect, Byte[][] payloads) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 147
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckAnalysisConsistency(Random random, Analyzer a, Boolean useCharFilter, String text, Boolean offsetsAreCorrect, Field field) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 1248
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.CheckRandomData(Random random, Analyzer a, Int32 iterations, Int32 maxWordLength, Boolean useCharFilter, Boolean simple, Boolean offsetsAreCorrect, RandomIndexWriter iw) in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 928
   at Lucene.Net.Analysis.BaseTokenStreamTestCase.AnalysisThread.Run() in C:\Users\Shannon\Documents\_Projects\Lucene.Net\Lucenenet.4.x-copy\src\Lucene.Net.TestFramework\Analysis\BaseTokenStreamTestCase.cs:line 716
Result StandardOutput:	
Culture: ru-KZ
Time Zone: (UTC+04:00) Astrakhan, Ulyanovsk
Default Codec: CheapBastard (CheapBastardCodec)
Default Similarity: RandomSimilarityProvider(queryNorm=False,coord=crazy): {}
Nightly: False
Weekly: False
Slow: True
Awaits Fix: False
Directory: random
Verbose: False
Random Multiplier: 1
Result StandardError:	TEST FAIL: useCharFilter=False text='\u300f\u3011\u302b\u3039\u3039\u3024 �\uc58a\u0010\ud815\udd05\ucc84\uc07e bpy'

…es outside of the loops that they were nested in so they can be reused.
…ch BytesRef() out of the loop (as it is in Lucene)
…de of the loops they were nested in so they can be reused (like in Lucene)
…ed to use ReaderWriterLockSlim to make reads more efficient
…der: Refactored to use ReaderWriterLockSlim to make reads more efficient
…ter: Refactored to use ReaderWriterLockSlim to make reads more efficient and used LazyInitializer for readerManger and taxoArrays
…ock(this), implemented proper dispose pattern
…ed all members to be virtual to allow users to provide their own LRU cache.
…locking on Dispose() method and made it safe to call dispose multiple times
…se(bool) and implemented proper dispose pattern. Avoid lock (this).
…d, inlined TryGetValue out variable declarations, cleaned up namespaces, added operator declarations (where recommended), marked fields readonly, removed unused fields, cleaned up line spacing
…ameIntCacheLru, NameHashInt32CacheLRU > NameHashInt32CacheLru. Refactored to utilize a generic type internally using composition to avoid boxing/unboxing without exposing the generic closing type publicly. Added public INameInt32CacheLru as a common interface between NameIntCacheLru and NameHashInt32CacheLru.
… generic struct that can be used in both TopOrdAndSingleQueue and TopOrdAndInt32Queue. Added Insert method to PriorityQueue to allow adding value types without reading the previous value for reuse.
…ter: Reverted back to using lightweight locking instead of using ReaderWriterLockSlim and LazyInitializer, as performance is better in this configuration.
… out call to EnterUpgradeableReadLock() so reads can happen concurrently.
… BytesRef instantiation outside of outer loop, which significantly improves performance
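Several of the commits above mention implementing the standard Dispose(bool) pattern, avoiding lock (this), and making Dispose() safe to call multiple times. A generic sketch of that shape (not the PR's actual code):

```csharp
using System;

public class DisposableResource : IDisposable
{
    private bool disposed; // simple flag; no lock (this) and no locking in Dispose()

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed) return; // calling Dispose() more than once is a no-op
        if (disposing)
        {
            // release managed resources here
        }
        disposed = true;
    }
}
```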
@NightOwl888 NightOwl888 merged commit 94a3e1e into apache:master Aug 22, 2020
NightOwl888 added a commit that referenced this pull request Aug 22, 2020
…er has unfair locking on Monitor.TryEnter(), the code was restructured to disallow any thread that doesn't have a lock into InternalTryCheckoutForFlush(). See #325.
@NightOwl888 NightOwl888 added this to the 4.8.0-beta00012 milestone Sep 7, 2020
Successfully merging this pull request may close these issues.

The design and implementation of a better ThreadLocal<T>