
FoxPro User Group News (con't)
The Systems Development department was developing strategy for the BIG applications, the Partner Information System (ParInfo) and the Mailing Development System (Campaign). Central to the whole development effort was the Data Modeling Department; they were involved in the development process for every application and were responsible for ensuring that each piece that was developed would fit into the overall "puzzle". I believe that this effort, though tedious during the planning stages, was one of the key ingredients that kept things together. During the initial phase we developed our own implementation of the foundation READ model, and we have used that for all but the very early applications.
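
For anyone who has not run across the term, the foundation READ is the usual FoxPro 2.x trick of parking the application in an "empty" READ so the menu system stays alive. The sketch below is only the bare-bones textbook form of the idea, not our implementation; the variable and menu names are made up.

   * Bare-bones foundation READ sketch; variable and menu names are invented.
   PUBLIC glQuitting
   glQuitting = .F.
   * mainmenu.mpr stands in for the generated menu program; its Exit option
   * would set glQuitting = .T. and issue CLEAR READ.
   DO mainmenu.mpr
   DO WHILE .NOT. glQuitting
      READ VALID glQuitting   && the "empty" READ keeps the event loop alive
   ENDDO
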
We had weekly "Programming Standards" meetings where we shared our experiences, discussed standard functions, etc., and gave overviews and status reviews of the pieces we were working on. We even argued a little <grin>.

We developed an "in-house" system we call "Configuration Management" (CM) to assist in the management of multi-programmer projects. The system, written in FoxPro, allows programmers to check out pieces of an application and provides a reasonable amount of protection against one programmer thwarting the efforts of the others.
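
The guts of CM are more involved than I can show here, but the basic check-out idea looks something like the sketch below. The table, tag, and field names are simplified stand-ins, not the real ones.

   * Simplified check-out sketch, not the actual CM code.  Assumes a shared
   * table CHECKOUT.DBF with fields MODULE (C), PROGMR (C), and OUTDATE (D),
   * plus a CDX tag MODULE on UPPER(module).
   PARAMETERS cModule, cWho
   PRIVATE lGotIt
   lGotIt = .F.
   SELECT 0
   USE checkout ORDER module
   SEEK UPPER(cModule)
   IF FOUND()
      IF .NOT. EMPTY(progmr)
         WAIT WINDOW cModule + " is already out to " + TRIM(progmr)
      ELSE
         IF RLOCK()              && lock the record before claiming the module
            REPLACE progmr WITH cWho, outdate WITH DATE()
            UNLOCK
            lGotIt = .T.
         ENDIF
      ENDIF
   ENDIF
   USE
   RETURN lGotIt
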
The ParInfo and Campaign systems were divided up into teams, and a team leader was assigned for each. The various department managers and the team leaders worked closely together to ensure that the pieces would all fit together. Individual programmers sort of floated between teams, depending on their individual strengths and experience and the immediate needs of the team and project at hand. At the final count, 25 or 30 programmers had a part in the Partner Info system and between 15 and 20 were involved in Campaign.

I have touched on a fair number of the mechanisms we employed to keep things moving along smoothly, but I have not mentioned one ingredient I believe was crucial to our efforts, one that tied us together and gave us a central focus. That ingredient was our common faith in God and our dependence on His guidance. We began our meetings with prayer, a number of us met in small groups each morning to pray before beginning work, and it was not an uncommon sight to see a programmer sitting at his computer with his head bowed as he wrestled with a particularly difficult problem. I didn't intend for this message to sound like an evangelical outreach, but I am most confident that we would not have achieved the success we have without this dependence on God.

Fm: Dick Whetstone
<< I really would like a followup on the search strategy poll. I would also like some clues on getting the kind of results you've attained with SQL. Are we talking single-file SELECTs, or that kind of performance (several seconds) SELECTing fields from more than one table? >>

I need to take a brief survey of the troops before I elaborate. I don't recall the precise situation; I just remember that we backed off of the SQL SELECT in a number of situations on the big tables and that the SEEK strategy worked much better.
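
To make the trade-off concrete, here is roughly the shape of the two approaches on a keyed lookup. The table, tag, and field names are invented for the example.

   * Invented example: PARTHIST.DBF keyed on partner_id, with a CDX tag PARTID.
   cId = "1001234"                  && sample key value
   SELECT 0
   USE parthist

   * SQL approach -- one statement, but the engine builds an entire cursor:
   SELECT partner_id, amount FROM parthist ;
      WHERE partner_id = m.cId INTO CURSOR hits

   * SEEK approach -- jump to the key through the tag and walk the matches:
   SELECT parthist
   SET ORDER TO TAG partid
   SEEK m.cId
   SCAN WHILE partner_id = m.cId
      * process each matching record here
   ENDSCAN
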
In a somewhat unrelated incident, we abandoned the SQL SELECT in a number of the Transaction Engine routines because we were getting dirty reads in some of the selects. This was not really a table size issue; it was more related to the speed we were seeing in the SELECT. We would do a SELECT of all the records in a table that met a certain condition, but by the time we actually got all of the records into the cursor, the last few records no longer met the criteria. The SQL SELECT process appears to be a two-stage process: it identifies the records first and then pulls them into the cursor. Not a very good situation if the values in the records are subject to change.
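
For anyone who wants to stay with the SELECT, one defensive pattern is to re-test each record against the live table before acting on it. A sketch, with invented table, tag, and field names:

   * Re-check each selected record against the live table before trusting it.
   * All table, tag, and field names here are invented.
   SELECT trans_no FROM translog WHERE status = "PENDING" INTO CURSOR pending
   SELECT 0
   USE translog AGAIN ALIAS live ORDER transno    && second handle on the table
   SELECT pending
   SCAN
      SELECT live
      SEEK pending.trans_no
      IF FOUND() .AND. live.status = "PENDING"    && does it still qualify?
         * safe to process the transaction here
      ENDIF
      SELECT pending
   ENDSCAN
   USE IN live
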
<< This has some effect on the CDX bloat that you've experienced, since the tags first have to account for the initial values and then the updated values, doesn't it? Have you found (or tried & failed) any strategies to minimize this? >>

I don't believe that changing the value of a key field has any effect on the size of the CDX, aside from the obvious change in the B-tree pointer. Victor Font's article in this month's FoxPro Advisor seems to agree with this. Changing the value does cause a great deal of movement in the tree, but the size doesn't change. We tried to avoid changing the value too often, just to minimize the amount of movement, and there was a small improvement in speed.
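
The simplest form of that habit is just to skip the REPLACE when the key has not actually changed, along these lines (field and variable names invented):

   * Only touch the key field, and therefore the B-tree entry, when the
   * value has really changed.
   IF m.cNewId <> partner_id
      REPLACE partner_id WITH m.cNewId
   ENDIF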

NEXT MEETING
Bruce Braunstein, the publisher of the on-disk FoxPro
magazine “FoxMasters”, will be speaking. He was
scheduled to speak last month before the Great
Quake of ‘94. Hopefully, the ground will stay still this
time around!