Page 1 of 2 12 LastLast
Results 1 to 15 of 20
  1. #1
    Star Lounger
    Join Date
    Jan 2008
    Posts
    51
    Thanks
    0
    Thanked 0 Times in 0 Posts
    I have two tables that may contain information a user is looking for. One is the "active" table; the other is an "archived" table. Information gets archived periodically, but sometimes may need to be brought back to the active table. Users generally access the active table (by entering a Customer ID in a form), but if they enter a Customer ID which does not exist in the active table, I use a DLookup function to check the archived table. If it exists there, they're given the option of viewing the archived record and/or bringing it back to the active table.

    The problem is that, over time, the archived table is getting huge and the DLookup is getting slow. Is there a method of looking the Customer ID up in the archived table that is faster than DLookup? I've read that opening a recordset is faster than DLookup. Is that true? How should I handle this situation? The database is split - the data physically resides in a separate database on the server.

  2. #2
    Star Lounger
    Join Date
    Dec 2009
    Location
    Mexico City, D.F., Mexico
    Posts
    81
    Thanks
    0
    Thanked 0 Times in 0 Posts
    The best way to speed your query is to make sure that there is an index CustomerID on the table Archive.
    This eco-post is made of recycled electrons

  3. #3
    Star Lounger
    Join Date
    Jan 2008
    Posts
    51
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Both tables are indexed by CustomerID. Any other suggestions?

  4. #4
    5 Star Lounger AndrewKKWalker's Avatar
    Join Date
    Apr 2001
    Location
    Cambridge, UK
    Posts
    1,020
    Thanks
    0
    Thanked 3 Times in 3 Posts
    I read somewhere that DLookup was optimized for local tables, but a recordset was better on linked tables.
    Quite willing to be corrected on that, though.

    Have a look at Allen Browne's ELookup function as well: ELOOKUP
    Andrew

  5. #5
    Gold Lounger
    Join Date
    Jun 2001
    Location
    Crystal Beach, FL, Florida, USA
    Posts
    3,436
    Thanks
    1
    Thanked 34 Times in 34 Posts
    Quote Originally Posted by odeus View Post
    I have two tables that may contain information a user is looking for. One is the "active" table; the other is an "archived" table. Information gets archived periodically, but sometimes may need to be brought back to the active table. Users generally access the active table (by entering a Customer ID in a form), but if they enter a customer ID which does not exist in the active table, I use a DLookup function to check the archived table. If it exists there, they're given the option of viewing the archived record and/or bringing it back to the active table. The problem is that, over time, the archived table is getting huge and the DLookup is getting slow. Is there a method of looking the customer ID up in the archived table that is faster than DLookup? I've read that opening a recordset is faster than DLookup. Is that true? How should I handle this situation? The database is split - the data physically resides in a separate database on the server.
    For a single lookup like that, I don't think opening a recordset is any faster. I trust you have compacted the db with the archived table?

    I'm assuming this is a front-end/back-end situation? If so, you can open the archive table directly (not as a linked table), and then use Seek to find the record. This is as fast as it gets.
    Mark Liquorman
    See my website for Tips & Downloads and for my Liquorman Utilities.

  6. #6
    2 Star Lounger
    Join Date
    Sep 2002
    Location
    Chicago, Illinois, USA
    Posts
    116
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by MarkLiquorman View Post
    For a single lookup like that, I don't think opening a recordset is any faster. I trust you have compacted the db with the archived table?

    I'm assuming this is a front-end/back-end situation? If so, you can open the archive table directly (not as a linked table), and then use Seek to find the record. This is as fast as it gets.

    Mark,
    What is the best way to do this, i.e., open the table directly? Use the IN external file clause in a query? Use a DAO Workspace? Other?

    Thanks

  7. #7
    Star Lounger
    Join Date
    Dec 2009
    Location
    Bellevue, WA
    Posts
    61
    Thanks
    1
    Thanked 0 Times in 0 Posts
    How big is "huge"? How many records in the archived table?

    Why not just write a select statement for a recordset that gets only the specific record?
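    For example, a sketch of a single-row lookup (table, field, and variable names here are illustrative, and this assumes CustomerID is numeric):

    ```vba
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim lngCustID As Long

    lngCustID = 12345
    Set db = CurrentDb
    ' Pull back only the one row, letting Jet use the CustomerID index
    Set rs = db.OpenRecordset( _
        "SELECT * FROM tblArchive WHERE CustomerID = " & lngCustID, _
        dbOpenSnapshot)
    If Not rs.EOF Then
        ' Customer exists in the archive
    End If
    rs.Close
    ```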

  8. #8
    Gold Lounger
    Join Date
    Jun 2001
    Location
    Crystal Beach, FL, Florida, USA
    Posts
    3,436
    Thanks
    1
    Thanked 34 Times in 34 Posts
    Quote Originally Posted by jswedlund View Post
    Mark,
    What is the best way to do this, i.e., open the table directly? Use the IN external file clause in a query? Use a DAO Workspace? Other?

    Thanks
    It's been a while, and I'm answering this remotely (I'm not at a computer where I can check syntax). Basically, it would be:

    Dim db As DAO.Database
    Dim rs As DAO.Recordset

    Set db = ... (it is here you'd directly reference the backend; can't remember the syntax)
    Set rs = db.OpenRecordset("tablename", dbOpenTable)

    I'd make these Public variables, so they are kept open (otherwise, you lose the speed advantage). Then when you need it:

    rs.Seek ...

    You'll have to use Help to fill in the blanks. Like I said, it has been a while.
    Mark Liquorman
    See my website for Tips & Downloads and for my Liquorman Utilities.
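    A filled-in sketch of the approach above (the backend path, table name, and index name are placeholders; note that a table-type recordset needs its Index property set before calling Seek):

    ```vba
    ' Module-level (Public) so the connection stays open between lookups
    Public db As DAO.Database
    Public rs As DAO.Recordset

    Public Sub OpenArchive()
        ' Open the backend file directly, bypassing the linked table
        Set db = DBEngine.OpenDatabase("\\server\share\Backend.mdb")
        Set rs = db.OpenRecordset("tblArchive", dbOpenTable)
        rs.Index = "CustomerID"   ' name of the index, not necessarily the field
    End Sub

    Public Function InArchive(lngCustID As Long) As Boolean
        rs.Seek "=", lngCustID
        InArchive = Not rs.NoMatch
    End Function
    ```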

  9. #9
    2 Star Lounger
    Join Date
    Sep 2002
    Location
    Chicago, Illinois, USA
    Posts
    116
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Thanks, Mark.

    I tested and see that this would work, but am wondering whether a query that uses the IN 'BackEnd.mdb' clause (as the recordset source) might work as well.

  10. #10
    Super Moderator
    Join Date
    Aug 2001
    Location
    Evergreen, CO, USA
    Posts
    6,623
    Thanks
    3
    Thanked 60 Times in 60 Posts
    Larry's question with regard to what is large is very relevant - we do searches on tables containing 4 million or more records and get subsecond response times, but the table has to be indexed properly, and you need a robust server that can keep the data in memory. (We link to a SQL Server backend.)
    Wendell

  11. #11
    Gold Lounger
    Join Date
    Jun 2001
    Location
    Crystal Beach, FL, Florida, USA
    Posts
    3,436
    Thanks
    1
    Thanked 34 Times in 34 Posts
    Quote Originally Posted by jswedlund View Post
    Thanks, Mark.

    I tested and see that this would work, but am wondering whether a query that uses the IN 'BackEnd.mdb' clause (as the recordset source) might work as well.
    Yes, it probably will. The key is to define the recordset as public so you don't have to constantly recreate it. That's where you lose time.
    Mark Liquorman
    See my website for Tips & Downloads and for my Liquorman Utilities.
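    For reference, a lookup using the IN clause might look like this (path, table, and variable names are placeholders):

    ```vba
    Dim rs As DAO.Recordset
    Dim strSQL As String
    Dim lngCustID As Long

    lngCustID = 12345
    ' Query the backend file directly via the IN clause
    strSQL = "SELECT * FROM tblArchive IN '\\server\share\BackEnd.mdb' " & _
             "WHERE CustomerID = " & lngCustID
    Set rs = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
    ```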

  12. #12
    2 Star Lounger
    Join Date
    Sep 2002
    Location
    Chicago, Illinois, USA
    Posts
    116
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by MarkLiquorman View Post
    Yes, it probably will. The key is to define the recordset as public so you don't have to constantly recreate it. That's where you lose time.
    Good point, thanks.

  13. #13
    Star Lounger
    Join Date
    Jan 2008
    Posts
    51
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Thanks for everyone's feedback. I don't have 4 million records - more like 500,000 (and growing) - but I don't have a "robust server". This is a front-end/back-end situation, but I'm only using Access tables as my backend, not SQL Server, so I'm hampered both by that and by the fact that the network is slow. I'll rewrite the code to open a recordset directly, as suggested by Mark, and see if that speeds things up. Thanks for your help.

    As an aside, Mark recommends "Define the recordset as Public".....is there an issue with keeping a recordset open? In the recesses of my mind, I remember being told "get in, get out" when dealing with data sets. "open the database, get your data and close the connection". Am I locking the records and leaving the recordset open by defining it as public? Will this be setting myself up for corruption or frustration by other users trying to access the same records moments later? I'll admit my schooling was several years ago and maybe things have changed....

  14. #14
    Star Lounger
    Join Date
    Dec 2009
    Location
    Bellevue, WA
    Posts
    61
    Thanks
    1
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by odeus View Post
    Thanks for everyone's feedback. I don't have 4 million records - more like 500,000 (and growing) - but I don't have a "robust server". This is a front-end/back-end situation, but I'm only using Access tables as my backend, not SQL Server, so I'm hampered both by that and by the fact that the network is slow.
    I still think you would be best served by a simple query in code (e.g. SELECT * FROM tablename WHERE <criteria>). Remember, the Jet engine is optimized for queries.

    I just ran a test against a 1.6GB table with 1.5+ million rows. The table is a linked Access table. With a query, selecting a specific row (against an indexed field) took well under 1 second. The linked table is on the same PC, so it is not slowed down by network access - which can be a significant slowdown. A slow network is even more reason to select a single row (or the minimum number of rows you need). I suspect (without testing) that having an open recordset that you use seeks on would be slower. I could be mistaken on that, but don't think so. I'd be interested in a comparison. Come to think of it, I might do that and let you know if you are interested. Let me know. It would not take long to test.

    Let us know how it goes.

  15. #15
    Gold Lounger
    Join Date
    Jun 2001
    Location
    Crystal Beach, FL, Florida, USA
    Posts
    3,436
    Thanks
    1
    Thanked 34 Times in 34 Posts
    Quote Originally Posted by odeus View Post
    Thanks for everyone's feedback. I don't have 4 million records - more like 500,000 (and growing) - but I don't have a "robust server". This is a front-end/back-end situation, but I'm only using Access tables as my backend, not SQL Server, so I'm hampered both by that and by the fact that the network is slow. I'll rewrite the code to open a recordset directly, as suggested by Mark, and see if that speeds things up. Thanks for your help.

    As an aside, Mark recommends "Define the recordset as Public".....is there an issue with keeping a recordset open? In the recesses of my mind, I remember being told "get in, get out" when dealing with data sets. "open the database, get your data and close the connection". Am I locking the records and leaving the recordset open by defining it as public? Will this be setting myself up for corruption or frustration by other users trying to access the same records moments later? I'll admit my schooling was several years ago and maybe things have changed....
    You want to minimize the number of recordsets you have open, as they take up memory. But the performance benefit of holding open a recordset connected directly to the backend outweighs the memory penalty of keeping it open.
    Mark Liquorman
    See my website for Tips & Downloads and for my Liquorman Utilities.
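    If you do keep a Public recordset open for the life of the application, it's worth closing it explicitly at shutdown (for example, from the Close event of a hidden startup form); a minimal sketch, assuming Public db and rs variables as suggested above:

    ```vba
    Public Sub CloseArchive()
        On Error Resume Next   ' ignore errors if already closed
        rs.Close
        Set rs = Nothing
        db.Close
        Set db = Nothing
    End Sub
    ```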
