2001-09-05, 15:22 #1
I am trying to use SyncSort to perform a sort and sum on a variable-length file that contains anywhere from 300 to 400 million rows of data. The issue is that the amount of disk space required to do this is maxing out resources. It is running on a Sun system. The file itself is actually fed by 24 independent files that are merged into one before a final step.
Are there any efficiencies to be gained here, or perhaps a better way of using SyncSort prior to the last step or during the last step?
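For context, one general way to cut the disk footprint, independent of any SyncSort-specific options, is to sort each of the 24 feed files separately and then merge the already-sorted streams while summing, since a merge of sorted inputs needs far less temporary space than a full sort of the concatenated file. A minimal sketch using standard Unix sort and awk (the file names, the `|` delimiter, and the key/value field positions are assumptions for illustration, not from the original post):

```shell
#!/bin/sh
# Sort each feed individually; each sort only needs temporary
# space proportional to one feed, not to the combined file.
for f in feed*.dat; do
    sort -t '|' -k1,1 "$f" > "${f%.dat}.sorted"
done

# Merge the pre-sorted streams (-m reads them sequentially without
# re-sorting), then sum the numeric second field per key.
sort -m -t '|' -k1,1 ./*.sorted |
awk -F '|' '{ sum[$1] += $2 }
            END { for (k in sum) print k "|" sum[k] }'
```

The same "sort the pieces, merge the results" idea should apply to whatever tool performs the final step, since the merge phase is sequential and avoids a second full-file sort workspace.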
2001-09-05, 15:43 #2
I don't know anything about SyncSort, but "running on a Sun system" makes me think maybe somebody on the Linux board (http://www.wopr.com/cgi-bin/w3t/postlist.pl?Cat=&Board=palm) might be more likely to have an answer.
2001-09-05, 17:22 #3 KTYorke (Guest)
Please post any suggestions you may have in this thread over on the Other OSes board: http://www.wopr.com/cgi-bin/w3t/showflat.pl?Cat=&Board=palm&Number=71310&page=0&view=collapsed&sb=5&o=0&fpart=