  1. #1
    3 Star Lounger
    Join Date
    Jan 2001
    Serbia and Montenegro (Yugoslavia)
    Thanked 0 Times in 0 Posts

    Complex Report in Access (Access 2K or XP)

    We are running a stress test of a new information system we are about to implement. Using the WinAPI, we are going to execute functions as if an end user were at a terminal. Every two minutes we will launch the program on another terminal until 90 terminals are running it. Each time the program starts or ends an event, it logs the time and the event name to an ASCII-delimited log file. I will be given this data to calculate the system's performance.
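
    For reference, here is the rough import step I have in mind. This is only a sketch; the file path, delimiter, and table name (tblEvents) are placeholders for whatever the test actually produces:

        ' Sketch: pull the delimited log file into a table.
        ' With no import specification named, TransferText assumes a
        ' comma delimiter; a saved import spec would pin the format down.
        Sub ImportEventLog()
            DoCmd.TransferText acImportDelim, , "tblEvents", _
                "C:\Logs\StressTest.log", False
        End Sub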

    I've created a test table with the columns TerminalName, EventStart, EventEnd, and EventName. Ideally I would like to graph the performance, but looking at the data I'm not sure how to approach this.
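
    As a first cut at summarizing it, something like the following lists the elapsed seconds per event (a sketch only; tblEvents is what I named the test table):

        ' Sketch: elapsed seconds for every logged event.
        ' Access 2000/2002 may need a reference to the DAO 3.6
        ' Object Library (Tools > References) for this to compile.
        Sub ListElapsedTimes()
            Dim db As DAO.Database
            Dim rs As DAO.Recordset

            Set db = CurrentDb()
            ' DateDiff('s', ...) returns whole seconds between the two times.
            Set rs = db.OpenRecordset( _
                "SELECT TerminalName, EventName, " & _
                "DateDiff('s', EventStart, EventEnd) AS ElapsedSeconds " & _
                "FROM tblEvents")
            Do While Not rs.EOF
                Debug.Print rs!TerminalName, rs!EventName, rs!ElapsedSeconds
                rs.MoveNext
            Loop
            rs.Close
        End Sub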

    I am assuming that as more terminals begin processing, performance will drop to some degree; that is, events will take longer to execute. As in the sketch above, I can calculate the seconds between the start and end times. I envision a graph with the number of terminals running on the X axis and the processing time on the Y axis, perhaps even a 3D graph to show multiple events at the same time.
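
    To get the X value for each point, one idea is a correlated subquery that counts how many events are in flight when each event starts; since each terminal runs one event at a time, in-flight events should stand in for busy terminals. Again just a sketch, and the output table name (tblEventStats) is made up:

        ' Sketch: for each event, its elapsed seconds plus a count of
        ' events in flight at its start time. Because each terminal runs
        ' one event at a time, the count doubles as the number of busy
        ' terminals. Results land in a new table for charting or export.
        Sub BuildEventStats()
            Dim db As DAO.Database

            Set db = CurrentDb()
            db.Execute _
                "SELECT E.TerminalName, E.EventName, " & _
                "DateDiff('s', E.EventStart, E.EventEnd) AS ElapsedSeconds, " & _
                "(SELECT Count(*) FROM tblEvents AS C " & _
                "WHERE C.EventStart <= E.EventStart " & _
                "AND C.EventEnd >= E.EventStart) AS TerminalsBusy " & _
                "INTO tblEventStats FROM tblEvents AS E", dbFailOnError
        End Sub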

    I'd welcome any suggestions on where to even begin, and on whether it would be easier to do this in Access or to take the data and crunch it in Excel.
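
    If Excel wins out, I gather pushing the computed results over is a one-liner; the table and path here are placeholders:

        ' Sketch: export the computed results to an Excel 2000 workbook.
        Sub ExportToExcel()
            DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
                "tblEventStats", "C:\Logs\StressResults.xls", True
        End Sub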

  2. #2
    Super Moderator
    Join Date
    Aug 2001
    Evergreen, CO, USA
    Thanked 58 Times in 58 Posts

    Re: Complex Report in Access (Access 2K or XP)

    It seems to me one of your bigger challenges is determining how many PCs are executing at any given moment. For one thing, synchronizing the clocks on all the PCs will be a significant task. Also, will a new event be launched at a workstation immediately (within milliseconds) after the previous event completes? If an event starts while N computers are running, a new PC comes on line so that N+1 are running, and then the event completes, how many computers do you count as running for that event? Finally, the PCs' characteristics (speed, RAM, hard drive condition) and the LAN topology will affect the results. So it's a pretty complicated analysis.
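
    One rough way to make that last ambiguity visible is to count the in-flight events at both ends of each event and see how often the two counts disagree. This is only a sketch against the table layout you described (tblEvents is a placeholder name):

        ' Sketch: busy counts at the start and at the end of each event;
        ' rows where the two differ are exactly the N vs. N+1 cases.
        Sub CompareBusyCounts()
            Dim db As DAO.Database
            Dim rs As DAO.Recordset

            Set db = CurrentDb()
            Set rs = db.OpenRecordset( _
                "SELECT E.TerminalName, E.EventName, " & _
                "(SELECT Count(*) FROM tblEvents AS S " & _
                "WHERE S.EventStart <= E.EventStart " & _
                "AND S.EventEnd >= E.EventStart) AS BusyAtStart, " & _
                "(SELECT Count(*) FROM tblEvents AS F " & _
                "WHERE F.EventStart <= E.EventEnd " & _
                "AND F.EventEnd >= E.EventEnd) AS BusyAtEnd " & _
                "FROM tblEvents AS E")
            Do While Not rs.EOF
                If rs!BusyAtStart <> rs!BusyAtEnd Then
                    Debug.Print rs!TerminalName, rs!EventName, _
                        rs!BusyAtStart, rs!BusyAtEnd
                End If
                rs.MoveNext
            Loop
            rs.Close
        End Sub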

    Assuming the above issues can be quantified, I would compute an average for each number of computers running and plot it on the X/Y graph as you suggest. I usually favor Excel for graphs, but that's mostly because it used to have the best tools; these days the tools are pretty comparable. It may also go a little faster in Excel because the data is pretty much all resident in RAM or in the swap file. I'm not sure this suggests where to begin, but it seems to me it would be desirable to know how long an event takes when only one PC is running. That gives you a best-case benchmark, and also some idea of how many records you are going to create.
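
    Once you settle on a counting convention, the plot data falls out of a simple totals query. This assumes a computed table along the lines of the hypothetical tblEventStats above, with a TerminalsBusy and ElapsedSeconds value per event:

        ' Sketch: average elapsed seconds at each concurrency level;
        ' one output row per X value on the proposed graph.
        Sub AverageByConcurrency()
            Dim db As DAO.Database
            Dim rs As DAO.Recordset

            Set db = CurrentDb()
            Set rs = db.OpenRecordset( _
                "SELECT TerminalsBusy, Avg(ElapsedSeconds) AS AvgSeconds, " & _
                "Count(*) AS EventCount " & _
                "FROM tblEventStats " & _
                "GROUP BY TerminalsBusy ORDER BY TerminalsBusy")
            Do While Not rs.EOF
                Debug.Print rs!TerminalsBusy, rs!AvgSeconds, rs!EventCount
                rs.MoveNext
            Loop
            rs.Close
        End Sub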
