Hi there,
did you ever get more info on this? I am actually in the middle of converting legacy documents to Alfresco. I am uploading via FTP, but it just gets slower and slower; at this rate it will take far too long. I have about 10 years of scan archives to upload within two months, without interfering with the new uploads and searches users are doing in the meantime. Funny how managers always assume it all just magically works out according to an imaginary schedule.
At first glance it doesn't seem to be high CPU usage, database load, or maxed-out memory. It actually went pretty fast when we started out: approx. 100,000 files in about 6-10 hours, just right for a nightly batch. Now it takes more than 3 seconds per file, which means it has to run day and night. Fortunately it doesn't seem to hurt overall performance much: other uploads pass right through without the bulk upload having to pause, and no user has complained so far about slowness or unresponsiveness.
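For what it's worth, here is how the numbers above work out (just a back-of-the-envelope sketch using the figures from this post, not new measurements):

```python
# Rough throughput check for the bulk upload described above.

def seconds_per_file(num_files: int, total_hours: float) -> float:
    """Average upload time per file, in seconds."""
    return total_hours * 3600 / num_files

# Early nightly batches: ~100,000 files in 6-10 hours.
fast = seconds_per_file(100_000, 6)    # 0.216 s/file
slow = seconds_per_file(100_000, 10)   # 0.36 s/file

# Current rate: ~3 s/file, i.e. roughly a 10x slowdown.
# At that rate, 100,000 files take about 83 hours instead of one night.
hours_now = 100_000 * 3 / 3600
print(fast, slow, hours_now)
```

So the per-file cost has gone up by roughly an order of magnitude, which is why the batch no longer fits in a night.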
I am at a loss as to where the slowdown is originating. Possibly the indexing has something to do with it, but I can't be sure. Any info on tuning performance for bulk uploads would be very welcome.
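In case it helps to pin this down, here is roughly how I could instrument the upload loop to see whether the per-file time grows as the repository fills up (a sketch only; `upload` stands in for whatever function actually issues the FTP transfer, e.g. a wrapper around ftplib's storbinary):

```python
import time
from typing import Callable, List

def timed_uploads(upload: Callable[[str], None], paths: List[str]) -> List[float]:
    """Call upload() for each path and record per-file wall-clock time.

    If the durations trend upward over the run, the cost grows with
    repository size (index maintenance would be one suspect); if they
    are flat but high, the bottleneck is fixed per-file overhead.
    """
    durations = []
    for path in paths:
        start = time.perf_counter()
        upload(path)  # hypothetical: the real FTP STOR call goes here
        durations.append(time.perf_counter() - start)
    return durations
```

Dumping the durations into a spreadsheet or a quick plot would show the trend at a glance.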