Tagged: 8.5.29, csv import, extension
This topic has 9 replies, 2 voices, and was last updated 6 years, 12 months ago by Steven Zahm.
11/16/2016 at 1:34 pm #400080
Jason
Guest
I have been trying to import 90,000 records and the process is incredibly slow.
I split the file into segments of 5,000 records and the first 20,000 records worked; it took about 30 minutes for each batch of 5,000 records to upload.
The next 10,000 slowed way down: it took about 2 hours for each 5,000 records to upload.
I'm up to 30K records and it has slowed to a crawl.
I'm building this site on a mid-range dedicated server that has no traffic at the moment: 8 cores/16 threads, 32 GB of RAM.
Is there any way to speed this up?
I even tried doing this on a local machine and had the same issues.
11/17/2016 at 9:28 am #400182
Steven Zahm
Keymaster
@ Jason
Hmmm… a large import like this I do break up into 10K files. The larger the file, the longer it takes to import, because the CSV library has to re-parse the entire file for each batch of 100 records imported.
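To give a rough sense of why (an illustrative sketch only, not the plugin's actual importer code): if the whole file is re-parsed for every 100-record batch, the total parsing work grows with the square of the file size.

<?php
// Illustration only: total rows the parser touches if the entire file
// is re-read once per 100-record batch.
function totalRowsParsed( $rows, $batchSize = 100 ) {
	$batches = (int) ceil( $rows / $batchSize );
	return $batches * $rows;
}

printf( "5K file:  %s rows parsed\n", number_format( totalRowsParsed( 5000 ) ) );  // 250,000
printf( "10K file: %s rows parsed\n", number_format( totalRowsParsed( 10000 ) ) ); // 1,000,000
printf( "90K file: %s rows parsed\n", number_format( totalRowsParsed( 90000 ) ) ); // 81,000,000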
Now, each new file should start quickly before "settling down," and each file should take about the same amount of time.
So, to clarify: what you're seeing is that each file takes longer than the preceding one?
Things that can be done to speed up the import…
Remove all empty columns from the CSV file. For example, if none of the entries you are importing contain a LinkedIn profile, delete that column. The fewer the columns, even empty ones, the less processing the CSV library has to do. (If you'd rather script that cleanup, see the sketch after these suggestions.)
Install the Code Snippets plugin and add a snippet with the following code:

add_action( 'plugins_loaded', 'cn_remove_geocode', 11 );

function cn_remove_geocode() {
	remove_filter( 'cn_set_address', array( 'cnEntry_Action', 'geoCode' ) );
}
Each address is sent to Google for geocoding to get its lat/lng. With this many imported entries they are likely no longer being geocoded, since Google only allows so many requests within a rolling 24-hour period unless you are a Google Maps API Premier subscriber (I think that's the correct term).
It's only a fraction of a second per address, but it adds up: at even a quarter of a second each, 90,000 addresses would add over six hours to the import.
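If you'd rather script the column cleanup than do it by hand, something along these lines would work as a standalone PHP script (the filenames are just examples; this is not part of the plugin):

<?php
// Standalone helper sketch, not part of Connections: drop columns that are
// empty in every data row of contacts.csv before importing.
$in   = fopen( 'contacts.csv', 'r' );
$rows = array();
while ( ( $row = fgetcsv( $in ) ) !== false ) {
	$rows[] = $row;
}
fclose( $in );

// Record the indexes of columns where at least one data row has a value.
$keep = array();
foreach ( array_slice( $rows, 1 ) as $row ) {
	foreach ( $row as $i => $value ) {
		if ( trim( (string) $value ) !== '' ) {
			$keep[ $i ] = true;
		}
	}
}

// Write out only the populated columns, header row included.
$out = fopen( 'contacts-trimmed.csv', 'w' );
foreach ( $rows as $row ) {
	fputcsv( $out, array_values( array_intersect_key( $row, $keep ) ) );
}
fclose( $out );

Run it once on each split file before importing.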
If you want, you can send me your split CSV files and I can do some testing to make sure there is nothing within them that would somehow be slowing things down.
Oh, which version of the CSV Import extension are you using? The latest?
11/17/2016 at 10:46 am #400244
Jason
Guest
Hi Steve,
Thank you for the response.
It seems each import does go slower. I am breaking the file up into 5K-record segments.
Good to know about deleting the unused columns; I will give that a try. I'm using version 1.4.
Since I have already geocoded my addresses, I'll try the code snippet.
I’ll try your suggestions to see if that helps.
Thanks,
Jason
11/17/2016 at 11:10 am #400246
Steven Zahm
Keymaster
@ Jason
Let me know how it goes.
12/05/2016 at 2:51 pm #402276
Steven Zahm
Keymaster
@ Jason
I've played with this a bit more… the correct snippet to remove geocoding of addresses is this:

add_action( 'admin_init', 'cn_remove_geocode', 11 );

function cn_remove_geocode() {
	remove_filter( 'cn_set_address', array( 'cnEntry_Action', 'geoCode' ) );
}

My guess is that admin_init works where plugins_loaded did not because it fires later, after Connections has registered the cn_set_address filter; remove_filter() only has an effect once the filter actually exists.
In testing on Bluehost (a business cloud account), bulk geocoding addresses causes issues. My best guess is that Bluehost is limiting external connections, perhaps to prevent their hosting services from being used for denial-of-service attacks. I dunno, just speculation, but I wanted to follow up with you in case it helps you out too.
12/08/2016 at 1:26 am #402657
Jason
Guest
Hi Steve,
I started a new project and I’m having the same issues.
I did add the code snippet to disable the geocoding; my records are pre-geocoded.
Also, I built the CSV file in LibreOffice 5.2.
I'm still having issues with upload times. I'm working on a 100K list right now, broken into 10K segments. The first 10K took about 14 minutes to upload, which is workable.
The 2nd 10K took 42 minutes to upload.
The 3rd 10K took 49 minutes to upload.
This is a fresh WP install with no other plugins, using the 2016 theme.
This is the server I am running on: https://www.ovh.com/us/dedicated-servers/details-servers-range-HOST-id-2016-HOST-32L.xml
I’m running WHM Version 60.
I checked my server stats while doing an upload and I'm only hitting 10% of the RAM with a load of about 1.7 on the CPUs, so hardware-wise I should be good.
I'm more than happy to send you the list as well as the System Info download. Just let me know where to send it.
Thanks in advance for your help.
12/08/2016 at 2:00 am #402658
Jason
Guest
While it's not ideal, is there a way to import this using phpMyAdmin?
I was looking at the Connections-related tables and they all seem straightforward, except for the main table labeled "connections"; I don't understand how the address and phone entries are built.
12/08/2016 at 10:01 am #402729
Steven Zahm
Keymaster
@ Jason
Hmmm… I would expect each 10K import to take the same amount of time, not increase. I cannot think of any reason off the top of my head that would cause this.
Go ahead and contact me via the Contact link at the very bottom of the page so we can take the discussion to email and get this resolved much quicker for you. I'm going to want a copy of the 100K CSV file and the 10K chunk files, which you will be able to send me when I respond to your contact form submission. I'll also want a screenshot showing the mapping you are using so that I can duplicate your import and hopefully track down the code logic causing the performance issue.
Also, just so I am on the same page: your server is dedicated, correct? So nothing else could be taking resources while you are importing?
As for a direct import into the database: the columns for address, phone, and such are serialized arrays of the data stored in the separate tables. These are used as caches so that during display of the directory only the main table needs to be hit with a query, versus making many, many queries to multiple tables.
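For illustration, here's roughly how one of those cache columns could be read back (a sketch only; the 'addresses' column name is an assumption, so check the actual schema before relying on it):

<?php
// Sketch: reading a serialized cache column from the main Connections table.
// The 'addresses' column name is assumed for illustration; the real schema
// may differ. Run with WordPress loaded (e.g. WP-CLI's eval-file). It also
// shows why a direct SQL import is tricky: the serialized cache has to stay
// in sync with the rows in the separate address/phone tables.
global $wpdb;

$row = $wpdb->get_row( "SELECT addresses FROM {$wpdb->prefix}connections LIMIT 1" );

// WordPress core helper: unserializes the value only if it is serialized.
$addresses = maybe_unserialize( $row->addresses );

print_r( $addresses );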
Looking forward to hearing back and resolving this.
12/13/2016 at 11:59 pm #403529
Jason
Guest
Steve,
I wanted to thank you for your efforts in resolving this problem. Your solution far exceeded my expectations. Once I updated to the latest version (with the fixes), the time to upload 100,000 records was cut from what would have been days to just over an hour.
Additionally, the responsiveness, communication, and level of support you provided are substantially better than any WordPress developer I have worked with in the last 7 years.
Once again thank you!
Jason
12/14/2016 at 10:04 am #403580
Steven Zahm
Keymaster
Thanks!