  1. #1
    2 Star Lounger · Join Date: Jan 2001 · Location: Indiana, USA · Posts: 107

    Controlling Quality of Data Entered

    This may be a stupid request, it may be common knowledge to the experts, or it may have been discussed in the forum in the past. I am new to the forum and somewhat new to Access (97). But, I have a question concerning quality control.

    Currently, our data entry staff enter data directly into the tables through forms. What I would like to know is, can Access hold data updates in a temporary table that can be reviewed for accuracy and then have the temporary table's content moved into the real tables to overwrite the old data for permanent storage?

    I have tried running Append and Update queries, but they appeared to just add new records to the main table. When I did this, I may not have had a key field defined or referential integrity turned on in the main table. Paradox 5 did allow record updates, but I haven't had any luck with Access. I'm sure I'm doing something wrong, but I don't know what. I have read the online help and haven't been able to find anything in it to assist me.
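    For reference, the difference between the two query types can be sketched like this, using Python's sqlite3 as a stand-in for Jet and made-up table and field names (an Update query overwrites matching rows by key; an Append query only adds rows):

    ```python
    import sqlite3

    # In-memory database standing in for the Access tables; the table and
    # field names here are hypothetical, not from the actual application.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE Main (ID INTEGER PRIMARY KEY, Name TEXT)")
    con.execute("CREATE TABLE Staging (ID INTEGER PRIMARY KEY, Name TEXT)")
    con.execute("INSERT INTO Main VALUES (1, 'old value')")
    con.execute("INSERT INTO Staging VALUES (1, 'corrected value')")
    con.execute("INSERT INTO Staging VALUES (2, 'new record')")

    # An Update query overwrites existing rows that share the key field ...
    con.execute("""
        UPDATE Main
        SET Name = (SELECT Name FROM Staging WHERE Staging.ID = Main.ID)
        WHERE ID IN (SELECT ID FROM Staging)
    """)

    # ... while an Append query adds only the rows the master table lacks.
    con.execute("""
        INSERT INTO Main
        SELECT * FROM Staging
        WHERE ID NOT IN (SELECT ID FROM Main)
    """)

    rows = con.execute("SELECT * FROM Main ORDER BY ID").fetchall()
    print(rows)  # → [(1, 'corrected value'), (2, 'new record')]
    ```

    Without a primary key on the main table, the Update query has no way to match staging rows to existing rows, which is consistent with the behavior described above.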

    Any guidance, assistance, or references to good books will be greatly appreciated. TIA.

  2. #2
    Plutonium Lounger · Join Date: Dec 2000 · Location: Sacramento, California, USA · Posts: 16,775

    Re: Controlling Quality of Data Entered

    Think about what you're trying to do first. The primary reason for doing batch updates is to speed up data entry. Management usually really likes anything that will do that. The primary reason for NOT doing batch updates is that you don't get bogus records in the first place if you validate up front.

    It's simpler (but slower) to keep out bad entries in the first place by doing your validation at the field or control level than it is to look at each record after the fact and try to figure out how to correct whatever may be wrong with it. As a result, the quality control that's supposed to get done may wind up being skipped. The usual argument is that the record should be "kicked back" to the operator later for batch corrections. In practice, the record may simply be flagged as OK by the operator without any corrections being made. The time saved at the front end can cost you in terms of data integrity. "Hello, this is the voice of experience speaking."

    The Borland Database Engine has always used a batch update approach, going back to the earliest days. Nothing is really saved until you update the master. The Jet engine skips the two-level save, except through the use of transactions, which don't really apply to forms. With Jet, you are always working in the master and everything happens now. It is possible to do batch updates on recordsets, either with transactions in DAO or directly in ADO, but this generally involves unbound forms and substantial coding to make it work.
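    As a rough sketch of what a transaction buys you (again using Python's sqlite3 rather than DAO, with hypothetical table names): nothing inside the batch is permanent until the transaction commits, and a failed review can roll the whole batch back.

    ```python
    import sqlite3

    # Hypothetical master table; sqlite3 stands in for the Jet engine here.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE Main (ID INTEGER PRIMARY KEY, Amount REAL)")
    con.execute("INSERT INTO Main VALUES (1, 100.0)")

    try:
        with con:  # opens a transaction: commits on success, rolls back on error
            con.execute("UPDATE Main SET Amount = 250.0 WHERE ID = 1")
            # A validation failure raised here would discard the whole batch,
            # e.g.: raise ValueError("record failed review")
    except ValueError:
        pass  # on rollback, Main would still hold the original 100.0

    final = con.execute("SELECT Amount FROM Main WHERE ID = 1").fetchone()[0]
    print(final)  # → 250.0 (the batch committed)
    ```

    The DAO equivalent wraps the recordset edits in BeginTrans/CommitTrans/Rollback on the Workspace, but as noted above, that approach generally requires unbound forms and a fair amount of code.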
    Charlotte
