  1. #1
    New Lounger
    Join Date
    Mar 2011
    Location
    Indiana
    Posts
    2
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Long time listener... first time caller

    I'm in severe need of help and not sure where to begin.

    Quick overview: I'm moving three SharePoint tables into Access 2007 (forms, reports, and the works). When it is all said and done, I'll do a final data migration over a weekend. Among other things, I've been creating some update queries to normalize these three tables into something manageable.

    This has worked OK for two of the tables, but the third has 240 fields (which is why they needed to move to a real database). I had no problem importing the table into Access. I broke down the "master" table into manageable chunks. The update query worked fine (in Datasheet view). But when I run everything, the update query stops as soon as it hits this huge table, because it can't find the first field (prompting me to key in the data). I traced it backward and found that when I open this 240-field master table, it won't let me make any changes. I get a "property value too large" error. Playing around I realize if I just delete a couple dozen fields, the table is happy again. But, of course, I'm not, and neither will my customers. Access is supposed to handle 255 fields, so I'm not sure what the deal is.

    Whatever I do to fix this, I have to be able to automate it somehow. I'm at a loss where to go with this, while I have my boss on the phone telling me they're in a critical state and they need my db done ASAP (yep, no pressure there).
    Last edited by ken90004; 2011-03-01 at 18:52.

  2. #2
    Super Moderator
    Join Date
    Aug 2001
    Location
    Evergreen, CO, USA
    Posts
    6,623
    Thanks
    3
    Thanked 60 Times in 60 Posts
    Unfortunately I think you are going to have to do more work on the design of the table. Access can handle up to 255 fields in a table, but that is an absolute limit, and when you start trying to run queries against a table approaching that limit, you are certain to run into the kind of problems you are seeing. Queries are also limited to 255 fields, but there is also a limit on the size of the query string, and you may be hitting that when you try to run it as an update. Take a hard look at the design of the master table, and I would be almost certain you will find it isn't normalized. The largest table I've ever seen in a database that has been properly normalized had less than 80 fields. Unfortunately redesigning the table won't solve the immediate crisis.

    You might look at storing the table in SQL Server as an interim solution - it allows up to 1024 fields in a table, but if you are trying to utilize forms and reports with an Access front-end you are still almost certain to run into problems. And I suspect SharePoint is running into issues with that kind of design as well. One thing you might look at in the process of trying to normalize the data is to use append queries to create new tables from the existing table, and run those processes in SQL Server - append queries tend to use fewer resources than update queries. If you are still stuck, you might try posting the details of the current table design with a few sample records.
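The append-query approach above can be sketched outside Access as well. Here's a minimal illustration in Python with sqlite3 (the table and column names are hypothetical, not Ken's real schema) showing how `INSERT INTO ... SELECT` carves narrower, normalized tables out of one wide master table instead of updating it in place:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical wide "master" table: one row mixes customer and order data.
cur.execute("""CREATE TABLE master
               (id INTEGER, cust_name TEXT, cust_city TEXT,
                order_no TEXT, order_amt REAL)""")
cur.executemany("INSERT INTO master VALUES (?,?,?,?,?)",
                [(1, "Acme", "Gary", "A-100", 50.0),
                 (2, "Blick", "Muncie", "B-200", 75.0)])

# Append queries (INSERT INTO ... SELECT) split the wide table into
# two narrower, normalized tables linked by the key column.
cur.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (cust_id INTEGER, order_no TEXT, amount REAL)")
cur.execute("INSERT INTO customers SELECT id, cust_name, cust_city FROM master")
cur.execute("INSERT INTO orders SELECT id, order_no, order_amt FROM master")

# The original rows are recoverable by joining the narrow tables back up.
print(cur.execute("""SELECT name, order_no FROM customers
                     JOIN orders ON id = cust_id""").fetchall())
```

The same `INSERT INTO ... SELECT` shape works in Access SQL or T-SQL; the win over an update query is that the new tables are written in one pass rather than rewriting rows in the oversized source.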
    Wendell

  3. #3
    New Lounger
    Join Date
    Mar 2011
    Location
    Indiana
    Posts
    2
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Thanks for the response and the suggestions.

    I work for the Department of Defense as a civil contractor, so I'm not sure how much data I could post without it getting me jailed. I recommended SQL Server from the get-go, but the final decision was that we'd have to use Access for security reasons. No clue.

    My problem is the users are still entering data into the SharePoint tables (or trying at least), and will be until I go live. So that weekend, I'll have to import everything into Access, including this silly big table. I have to execute all my data migration queries, and they will normalize the data structure. Afterward, this huge table is going bye-bye as it'll turn into 15 tables, if I can get past this problem.

    I tested the SQL-string-length theory, and the string didn't appear too large, but I went ahead and cut it down to a one-field update query. Still the same problem. I even tried updating different fields instead of my problem one, and the same problem occurs across the board.

    This is a one-time-use function, and I've wasted more time on this than it would take me to just do it manually. I'm thinking that is my fix: I'll just manually copy and paste. I'll change all of the queries to select queries (they work fine as selects), and copy and paste all fifteen tables. Won't take five minutes.

    Sometimes it comes down to the question of time management.

    Thanks again. You've really helped me work through the thought process.

    ~Ken

  4. #4
    New Lounger
    Join Date
    Mar 2011
    Location
    Melbourne, Australia
    Posts
    2
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Hi Ken,
    Maybe a stupid solution - perhaps when importing, Access is creating some temporary fields that push the table over a size limit.

    You state "Playing around I realize if I just delete a couple dozen fields, the table is happy again. But, of course, I'm not, and neither will my customers."
    Perhaps if you chop the 240-field table in two, import 120 fields at a time, and then combine the two tables into one?
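    For what it's worth, the split-then-recombine idea depends on each half carrying the same key column so the rows can be matched back up. A minimal sketch in Python with sqlite3 (all names hypothetical, and only a few columns standing in for the 120 per half):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical halves of a wide import: each half keeps the key column
# (rec_id) so rows can be matched back up after importing separately.
cur.execute("CREATE TABLE half_a (rec_id INTEGER, f1 TEXT, f2 TEXT)")
cur.execute("CREATE TABLE half_b (rec_id INTEGER, f121 TEXT, f122 TEXT)")
cur.executemany("INSERT INTO half_a VALUES (?,?,?)",
                [(1, "a", "b"), (2, "c", "d")])
cur.executemany("INSERT INTO half_b VALUES (?,?,?)",
                [(1, "x", "y"), (2, "w", "z")])

# Recombine with a make-table style query: join the halves on the key
# so each output row stitches both halves of the original record together.
cur.execute("""CREATE TABLE combined AS
               SELECT a.rec_id, a.f1, a.f2, b.f121, b.f122
               FROM half_a a JOIN half_b b ON a.rec_id = b.rec_id""")
```

In Access the recombine step would be a make-table query joining the two imports on the key; the catch in Ken's case is that the combined table would still hit the same 240-field wall, so the halves may be more useful as a staging point for normalization than for reassembly.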

    Cheers
    Earle
