2014-08-12, 11:57 #1
Communicating and Sharing among Excel Workbooks while running Web Query Macros
I am not sure if this will end up being a spreadsheet question, but that is definitely where it starts. I may be doing things with the wrong tools, or I may need to use other tools in conjunction with Excel. I am a relative neophyte when it comes to Excel, since I have never had any training in it (my last spreadsheet training was on Lotus 1-2-3, version 2.01). I had been using Excel only for rudimentary things until about a year ago, and this is the first time I have attempted anything more complex than forecasting an Alternative Minimum Tax computation (which was actually converted from Quattro Pro, not originally built in Excel).
Here is what I am currently doing:
I have a series of Excel Workbooks, each monitoring activity on several websites (one of which requires a login). We can think of this as monitoring security A, B, C, D, etc. (one per workbook), though we are not actually talking about securities. Each spreadsheet stores its data locally and reports it on a worksheet. A VBA macro runs continuously to refresh the web information periodically so that it is at most about a minute old. On one machine I can run 4-6 instances of this workbook (each monitoring something different), depending on the memory and CPU resources available. Each of them does its own web query and updates its sheets internally for the display. For a variety of reasons, they cannot easily be modified to monitor and report all items (or even more than one), and doing so would increase their size significantly (each Workbook is 5.6 to 6.0 MB in size and uses about 150 MB of memory when logged into the website and running the VBA code). Each of these currently must run in a separate Excel instance or they will "interfere" with each other, since they are all running the same VBA macro.
I am trying to re-architect these so that:
- They share information from a single web query rather than each logging in and retrieving their data separately
- They report their information to a separate "dashboard-like" Workbook for easier viewing (I can view what is displayed currently, but need to interrupt the macro to scroll around to see other information on the current sheet).
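For goal #1, one common pattern is for each Calculation Workbook to open the shared workbook read-only and copy values across. A minimal sketch, assuming hypothetical file paths, sheet names, and ranges (none of these are from your actual workbooks):

```vba
' Sketch: a Calculation Workbook pulling data from the shared
' WebQuery Workbook instead of running its own query.
' Path, sheet names, and range are hypothetical placeholders.
Sub PullFromWebQueryWorkbook()
    Dim src As Workbook

    Application.ScreenUpdating = False

    ' Open the shared workbook read-only so several Calculation
    ' Workbooks can read it at the same time.
    Set src = Workbooks.Open( _
        Filename:="C:\SugarSync\WebQuery.xlsm", _
        ReadOnly:=True, UpdateLinks:=0)

    ' Copy values only (no formulas or formatting) into the local
    ' sheet that the web query used to populate.
    ThisWorkbook.Worksheets("ActivityA").Range("A1:H500").Value = _
        src.Worksheets("ActivityA").Range("A1:H500").Value

    src.Close SaveChanges:=False
    Application.ScreenUpdating = True
End Sub
```

Assigning `.Value` range-to-range like this avoids the clipboard entirely, which is usually faster and does not disturb whatever the user has copied.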
I am looking for ideas and suggestions on ways others may have achieved any of the above.
To do #1, what I am currently attempting (on a separate machine) is to run all the queries in a separate Workbook (the WebQuery Workbook), so that it has a single login and simply cycles through each activity, gathering data on three different worksheets per activity. I currently have it set up to monitor up to 12 activities from three source tables, and it seems able to run these 36 queries in under 2 minutes. This workbook is then saved on a SugarSync-shared drive, and then again on a local drive, before resuming the set of web queries. On the other machine, each of several Calculation Workbooks will periodically copy data from the WebQuery Workbook (opened in read-only mode) into the pages it would have normally loaded via a web query. This has significantly reduced the memory usage and improved overall speed, although it does introduce a single point of failure which never existed before. The website can become "busy" and force a query to fail, interrupting the VBA macro. Not sure if there is a way around this (perhaps there is a way to continue on error?) but I would love ideas here too. I am in the process of testing this alternative and it seems to be somewhat working (it seems a bit finicky; in yesterday's test, one of the macros in a workbook reading the data generated an odd value error that I could not duplicate).
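On the "continue on error" question: VBA's `On Error` statement can catch a failed refresh so the macro retries or moves on instead of stopping. A hedged sketch, assuming the query is the first QueryTable on a sheet named "ActivityA" (both names are placeholders, not taken from your workbooks):

```vba
' Sketch: retry a web-query refresh so one busy-server failure
' does not kill the whole macro. Sheet/query names are placeholders.
Sub RefreshWithRetry()
    Const MAX_TRIES As Long = 3
    Dim qt As QueryTable
    Dim attempt As Long

    Set qt = Worksheets("ActivityA").QueryTables(1)

    For attempt = 1 To MAX_TRIES
        On Error Resume Next
        qt.Refresh BackgroundQuery:=False   ' wait for the refresh to finish
        If Err.Number = 0 Then
            On Error GoTo 0
            Exit Sub                        ' success
        End If
        Err.Clear
        On Error GoTo 0
        Application.Wait Now + TimeSerial(0, 0, 10)  ' pause before retrying
    Next attempt

    ' All retries failed: note it and let the caller move on to the
    ' next activity rather than halting everything.
    Debug.Print "Refresh failed for ActivityA at " & Now
End Sub
```

`BackgroundQuery:=False` makes the refresh synchronous, which is what lets the error (if any) surface right there where `On Error` can see it.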
To do #2, I am not sure how to proceed. I have considered copying the currently active sheet from each Calculation Workbook to a separate workbook (on a shared drive) and then using a similar method to import these into a spreadsheet that combines data from all of the workbooks. But then this combined spreadsheet would still need to be constantly running a VBA macro to refresh the sheets, and I would still not be able to page from sheet to sheet, nor scroll around within one, without interrupting the macro. The sheets I want to browse are quite complex (from my perspective), with over 20,000 cells populated with computed data, probably over 400 cells which I may want to review at any given time, and graphs of data trends over time (sparklines). Currently, I am able to combine this information to display about 70 of these cells on the screen while the macros are running. If I want to look further, I stop the macro, look at the sheet, and then restart it. At the end of each update cycle, the VBA macro displays a timed popup window asking whether I want to continue, so that it is never interrupted in the middle of a data retrieval/population cycle and I can stop it in an orderly fashion (and not accidentally stop the wrong one). If I ignore the popup, the macro simply continues after the timer is up.
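One thing that might remove the scrolling problem entirely: instead of a continuous loop with a timed popup, the refresh can be scheduled with `Application.OnTime`. Between scheduled runs, control returns to Excel, so you can page and scroll freely; the macro only takes over for the duration of one update cycle. A minimal sketch with placeholder procedure names:

```vba
' Sketch: schedule refresh cycles with Application.OnTime instead of
' running a continuous loop. Procedure names are placeholders.
Public NextRun As Date

Sub ScheduleRefresh()
    NextRun = Now + TimeSerial(0, 1, 0)      ' next cycle one minute from now
    Application.OnTime NextRun, "RefreshCycle"
End Sub

Sub RefreshCycle()
    ' ... do one data-retrieval/population cycle here ...
    ScheduleRefresh                          ' queue the next cycle
End Sub

Sub StopRefreshing()
    ' Cancel the pending run for an orderly stop (replaces the
    ' timed popup as the stopping mechanism).
    On Error Resume Next
    Application.OnTime NextRun, "RefreshCycle", , False
End Sub
```

Because nothing is running between cycles, browsing the sheets never "interrupts" anything; `StopRefreshing` gives the same orderly shutdown the popup currently provides.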
My reason for posting is that I am at a somewhat critical decision point now. I have spent the past year tweaking these individual workbooks to gather/display/report the kind of information that I believe is relevant. I am now trying to improve the functionality so that they are a bit more efficient, reliable, and usable.
Any thoughts on either how I should be gathering the web data and making it accessible to the Workbooks or extracting the computed sheet data from the Workbooks into some form of reporting mechanism that can refresh itself in the background?
Thanks for your help (even if you have just read this all and are merely scratching your head).
2014-08-21, 21:53 #2
1. The first thing that I would consider is converting the files to .xlsb format to reduce the size and memory consumption while speeding up the performance a bit.
2. To make sure that your data is loading as fast as possible, I would attempt to get the fastest Internet download speeds possible. If you are using wireless, you may consider an Ethernet connection instead if possible. Not knowing your setup, a router upgrade may also be in order. If using wireless and available, set up Quality of Service (QoS) to give your computer priority bandwidth over other computers using the same connection.
3. Implementing a dual monitor setup may negate the need to scroll around on your monitor and interrupt the running macros.
4. In the connection's Properties window, make sure that "Preserve cell formatting" is checked, then remove as much formatting as possible from the worksheet cells. This should keep the downloaded data from overwriting the cell formatting, while the cells themselves carry little formatting of their own to slow things down.
5. Make sure your macros start and end with the code lines Application.ScreenUpdating = False and Application.ScreenUpdating = True, respectively.
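Expanding on point 5, the same pattern is often combined with two related settings, restored in a cleanup block so they come back even if the macro errors out. A generic sketch (the procedure name is a placeholder):

```vba
' Sketch of the start/end pattern from point 5, plus two related
' settings that often speed up long macros.
Sub FastMacroTemplate()
    On Error GoTo CleanUp
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual   ' defer recalculation
    Application.EnableEvents = False                ' suppress event handlers

    ' ... main work goes here ...

CleanUp:
    ' Always restore the settings, even after an error.
    Application.EnableEvents = True
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
```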
6. I agree that running all the queries in a separate workbook is a good idea. I would suggest replacing as many macros with formulas as possible. It is far faster to pull data from workbook to workbook with formulas than with VBA. Without knowing your application, it is impossible to evaluate the scope of using formulas as an alternative. I would think it a better setup to pull the data from the query workbook, as opposed to the individual computers expecting the data to be pushed to them from a system that is already being burdened. Therefore, you should be running those macros from the "client" computers.
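To illustrate the formula-based pull in point 6: an external-reference formula keeps a local cell linked to the query workbook with no macro involved on the reading side. The sketch below writes such a formula from VBA; the path, workbook, and sheet names are hypothetical placeholders:

```vba
' Sketch: link a local cell to the shared query workbook with an
' external-reference formula instead of VBA copying.
' Path and names are placeholders.
Sub LinkToQueryWorkbook()
    ' The linked cell recalculates like any other formula, so no
    ' macro has to run in this workbook to keep it current while
    ' the source workbook is open.
    Worksheets("ActivityA").Range("A1").Formula = _
        "='C:\SugarSync\[WebQuery.xlsm]ActivityA'!A1"
End Sub
```

You could also type the formula directly into the cell once; the VBA wrapper is only useful if you need to generate many such links programmatically.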
Hopefully, something here may help. I will post if anything else comes to mind. Keep us updated.
2014-08-28, 08:36 #3
Web scraping, especially for stock market data, is something I do a lot. If you can send your file(s) to email@example.com, I'll take a look at simplification. If not, at least post your code for comments. I would probably use ONE file for all. ???