*Dr. McLaughlin is a member of the technical staff, Air Transportation Systems, for Mitre Corp. He can be reached at 1740 Westwind Way, McLean, VA 22102.*

1. Randomize the starting configuration.
2. If possible, find a better configuration by making repeated (usually random) changes to the current configuration.
3. Make this better configuration the current one, then go back to step 2.
4. If a better configuration is not found after a reasonable amount of effort, conclude that you have obtained a local optimum. Save it in the variable OPTIMUM.
5. Repeat from step 1, replacing OPTIMUM if appropriate, until your patience or computing resources are exhausted. Hope that the final value of OPTIMUM is global.
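The five steps above can be sketched as a random-restart hill climb. Everything below is illustrative rather than taken from the article: the one-dimensional score function, the +/-1 moves, and the function names are stand-ins for a real configuration space.

```c
#include <stdlib.h>

/* Toy 1-D "configuration": an integer x.  The score to maximize is
   -(x-7)^2, whose global optimum is x = 7.  In a real problem,
   score() would evaluate the whole configuration and the moves
   would be random perturbations rather than +/-1 steps. */
static long score(int x) { return -((long)(x - 7)) * (x - 7); }

/* One greedy descent (steps 2-4): accept only improving moves. */
static int greedy(int x)
{
    for (;;) {
        if      (score(x + 1) > score(x)) x = x + 1;
        else if (score(x - 1) > score(x)) x = x - 1;
        else return x;            /* no improvement: local optimum */
    }
}

/* Steps 1 and 5: random restarts, keeping the best local optimum. */
int greedy_with_restarts(int nrestarts, unsigned seed)
{
    int optimum = 0;
    srand(seed);
    for (int r = 0; r < nrestarts; r++) {
        int x = rand() % 1000 - 500;        /* step 1: randomize */
        int local = greedy(x);
        if (score(local) > score(optimum))  /* step 5: keep best */
            optimum = local;
    }
    return optimum;
}
```

Because this toy score surface has a single peak, the greedy descent always finds it; the goat analogy below explains why real, rough surfaces are not so forgiving.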

The deficiencies of the greedy algorithm are not obscure. Imagine a mountain goat traipsing about the Rocky Mountains looking for the deepest valley (where the best grass is). This goat is very hungry (greedy?) and, consequently, never takes an upward step. Unless the goat starts out on a smooth slope of the deepest valley, it will never reach that valley but will, instead, come to a dead end in some higher valley, that is, in a local minimum, which may be much higher than the global minimum. Moreover, a space as rough as the Rockies contains thousands of valleys, some only a few feet deep, and repeating the exercise (step 5, above) may yield improvement but probably not the best result.

The difficulties of combinatorial optimization can be illustrated in the game of poker solitaire where the object is to rearrange the 25 cards of the tableau, like that in Figure 1, so that the 12 hands of straight poker formed from the 5 rows, 5 columns, and 2 diagonals give the highest total score. Various scoring schemes are popular, but I shall use the one listed in Table 1 in which each hand of a pair or better has a value inversely proportional to its probability in a standard deck, normalized to that of a single pair. All hands of a given type (for example, full house) are considered equal, and an ace can be high or low.

| Hand            | Score  |
|-----------------|-------:|
| Straight flush  | 27,456 |
| Four of a kind  |  1,760 |
| Full house      |    293 |
| Flush           |    215 |
| Straight        |    108 |
| Three of a kind |     20 |
| Two pair        |      9 |
| Pair            |      1 |

Table 1: Hand values, normalized to the value of a single pair.

The tableau in Figure 1 contains 7 pairs for a score of 7, although there are obviously 4 aces and 4 kings, so it is reasonable to expect that a good score would be at least 4000. Finding a really high score is not easy, however. The 12 poker hands are tightly interlocked, and even a small change in configuration can result in a drastic change in score. This is another way of saying that the configuration space is very rough. Informal experiments have shown it to be more than a match for a human player, and it provides a good test for any combinatorial optimization technique.

In simulated annealing, score is associated with energy such that minimizing the energy means optimizing the score. In the poker solitaire example, high score signifies low energy. The tableau in Figure 1 can be thought of as a collection of "atoms" frozen into a configuration that has too much energy. Proceeding with this analogy, the tableau may be subjected to simulated annealing by raising its "temperature" until it is almost "melted," then cooling it slowly until it is again "frozen." Although the connection between real and simulated annealing appears to be little more than a metaphor, this system (25 playing cards) responds in the same way as does a physical system -- and with the same results.

In any physical system, the average energy, <E>, of the constituent atoms is a function of the absolute temperature, T. As the temperature of a system decreases, the average energy of its atoms also decreases. A typical cooling curve is shown in Figure 2. The slope of this curve, d<E>/dT, is the heat capacity, C, of the system. In regions where the curve is steep, the heat capacity is high. This usually indicates a phase change -- water into ice, for example. Quite often, these regions also display large values of C/T. This quantity is equal to dS/dT, the rate of change of entropy, S, with temperature, which is related to a change in the orderliness of the atoms. Therefore, regions of a cooling curve that are steep usually imply a significant reconfiguration of the atoms (or molecules) at the corresponding temperatures. In an annealing process where temperature is being regulated, you would have to slow down the cooling in such regions in order to allow the atoms time to rearrange themselves and to permit their average energy to decrease gradually. Annealing works best when this is done.

Simulated annealing also produces cooling curves much like that of Figure 2. In fact, the literature of simulated annealing demonstrates that these analogies run very deep and it is not at all easy to decide just where to draw the line between metaphor and methodology.

At any temperature, there are generally more atoms at lower energies than at higher energies. The Boltzmann distribution (see Example 1, equation 1) describes the relative number of atoms (or molecules) populating any two energy states, E_{hi} > E_{lo}, at some temperature, T, where N_{x} is the number of atoms with energy E_{x}, T is the absolute temperature (degrees Kelvin), and k is Boltzmann's constant. When equation 1 is adapted to the problem of combinatorial optimization, the constant k is ignored, energy is equated to score (cost), and the left-hand side of equation 1 is interpreted as a probability. Simulated annealing uses such probabilities to direct a stochastic search of the configuration space by employing equation 2 of Example 1, where S_{x} is a score and T is a scaling factor.

Equation 1: N_{hi}/N_{lo} = exp((E_{lo} - E_{hi})/kT)
Equation 2: Prob(accept worse score) = exp((S_{lo} - S_{hi})/T)

As equation 2 suggests, choosing a worse configuration is a function of temperature. The higher the temperature, the higher the "energy" of the system. The higher the energy of the system, the more readily it will do the work of moving in an unfavorable direction -- just like real atoms and molecules. As the temperature decreases, however, any given expenditure of energy becomes less likely. Consequently, simulated annealing automatically adapts to the magnitude of the features of a good configuration. The most profitable features are found first and the smaller, more subtle, features later. Starting with the tableau in Figure 1, simulated annealing first finds the four kings and four aces. If the temperature is very high, these features may be forgotten from time to time, but as the temperature decreases, eventually they are frozen into every acceptable tableau. As the temperature continues to fall, hands of lower and lower scores are successively frozen until, finally, the entire tableau is frozen. Simulated annealing is terminated at this point.

It is clear that the annealing schedule -- the program for reducing the temperature -- is critical; a large drop in temperature at the wrong time may freeze in undesirable features. (Ideally, all reconfigurations should be reversible.) The annealing schedule answers the following four questions:

1. What should be the value of the starting temperature?
2. When should the temperature be changed?
3. How should the temperature be changed?
4. When should the search be stopped?

A quick-and-dirty procedure is to pick a starting temperature high enough so that virtually all scores are deemed acceptable, hold that and each successive temperature for (100*SIZE) reconfigurations or (10*SIZE) successful reconfigurations, whichever comes first, then decrease the temperature by a constant factor -- for example, T(new) = 0.9*T(old) -- stopping when T reaches some very low value (say, 0.1). Such a schedule is often employed for an exploratory run on a new problem in order to determine the shape of the cooling curve. Given the cooling curve, you can then create a proper annealing schedule by arranging for the temperature to decrease most slowly where the curve is steepest.
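The geometric part of this quick-and-dirty schedule can be sketched as follows. The names `next_temperature` and `steps_to_cool` are illustrative, and the hold counts (100*SIZE tries or 10*SIZE successes per temperature) would be enforced by the inner Metropolis loop, which is omitted here.

```c
#define SIZE 25           /* problem size: 25 cards in the tableau */

/* Constant-factor cooling: T(new) = 0.9 * T(old). */
double next_temperature(double T) { return 0.9 * T; }

/* How many geometric steps does it take to cool from t_start
   down to (or below) t_stop?  Useful for sizing an exploratory run. */
int steps_to_cool(double t_start, double t_stop)
{
    int n = 0;
    while (t_start > t_stop) {
        t_start = next_temperature(t_start);
        n++;
    }
    return n;
}
```

For example, cooling from 1000 to 0.1 at a factor of 0.9 per step takes 88 temperature plateaus, each holding for up to 100*SIZE = 2500 reconfigurations.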

The ANNEAL.C program (see Listing One) solves the poker solitaire problem and also computes the average and standard deviation of the scores accepted at any temperature. These statistics help answer the questions posed above.

To choose a starting temperature, begin by picking one that seems high relative to the scores expected and run the program for that temperature only. There should be very few reconfigurations rejected. After equilibrium is reached at this temperature (see below), statistics will be output, one of which is SIGMA, the standard deviation of the acceptable scores. As the temperature approaches infinity, SIGMA approaches SIGMA(inf), the standard deviation of a large set of random configurations. Set the starting temperature equal to 20*SIGMA(inf).

There is no point in trying further reconfigurations at a given temperature once a system has reached thermal equilibrium. (The method used here for recognizing equilibrium is adapted from a procedure published by Huang, Romeo, and Sangiovanni-Vincentelli.)

ANNEAL.C keeps track of the number of reconfigurations accepted (nsuccesses) and rejected (nfailures) at a given temperature as well as the number of acceptable scores less than or more than SIGMA/2 from the average acceptable score (incount and outcount, respectively). The code in Listing Two is then used to decide whether or not equilibrium has been reached. (Uppercase variables are parameters.)

Once equilibrium has been established, the temperature is decreased, provided that the current configuration has a score no worse than SIGMA/2 below the average score; the temperature is not changed until this is true. The idea is to avoid carrying a poor outlier over to a new, lower temperature; there is no other reason for delaying the transition.

The simplest and most flexible method for computing a new temperature is to use a table, matched to the cooling curve, listing the desired sequence of temperature ratios, T(new)/T(old), and the temperatures at which they take effect. Typically, these ratios fall in the range (0.5, 1.0). ANNEAL.C permits up to ten values for these ratios in addition to the initial value. Table 2 gives the actual ratios used in this example.

| Temperature Less Than | t_ratio |
|----------------------:|--------:|
| --                    | 0.7     |
| 360                   | 0.8     |
| 215                   | 0.7     |
| 85                    | 0.8     |
| 60                    | 0.9     |
| 30                    | 0.95    |
| 15                    | 0.9     |
| 7                     | 0.8     |
| 3                     | 0.7     |
| 0                     | 0.7     |

Table 2: Temperature ratios used in this example.

In ANNEAL.C, a giant step consists of interchanging 2 of the 12 hands of the tableau, selected at random from a uniform distribution. A baby step, on the other hand, is implemented using a "rotation." A random integer in the range [2,5] is chosen from an exponential distribution (see Table 3). This number of cards is then selected from the tableau using a uniform distribution. The first card selected is temporarily set aside and the resulting hole filled with card number 2, and so on. The last hole is filled with card number 1. The probability that a given reconfiguration is a giant step rather than a baby step is a parameter, exg_cutoff, here set equal to 0.2. Using only giant steps or only baby steps generally gives poor performance.

| # of Cards to Rotate | Probability |
|---------------------:|------------:|
| 2                    | 0.33333     |
| 3                    | 0.27018     |
| 4                    | 0.21899     |
| 5                    | 0.17750     |

Table 3: Rotation-size probabilities for baby steps.
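The thresholds stored in partition[] in Listing One are just the cumulative probabilities of Table 3 scaled by the random-number generator's modulus, 2147483399. The sketch below rebuilds them from the table; because Table 3 rounds the exact exponential weights to five figures, the results agree with the listing's constants only approximately.

```c
#define MODULUS 2147483399L

/* Table 3 probabilities for rotating 2, 3, 4, or 5 cards. */
static const double p_rotate[4] = {0.33333, 0.27018, 0.21899, 0.17750};

/* Cumulative threshold for a rotation of up to ncards cards
   (ncards in [2,5]).  A 31-bit draw below rotation_threshold(k)
   but at or above rotation_threshold(k-1) selects a k-card rotation. */
long rotation_threshold(int ncards)
{
    double cum = 0.0;
    for (int k = 2; k <= ncards; k++)
        cum += p_rotate[k - 2];
    return (long)(cum * MODULUS);
}
```

Drawing the rotation size this way biases baby steps toward small, local rearrangements while still permitting the occasional five-card shuffle.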

The simulated annealing algorithm was carried out 30 times, starting with the tableau in Figure 1 each time, using a random-number generator with a period much longer than that required by the experiment. The greedy algorithm was then applied to the same tableau for the same amount of CPU time. Apart from the rule for accepting or rejecting a new configuration, the two programs are essentially identical. In particular, all the relevant parameter values are the same (see A.DAT, Listing Three). The best result from either technique is a score of 4516 (see Figure 3). This apparent optimum, found once by the greedy algorithm and twice by simulated annealing, is degenerate (that is, there are several tableaux with this score, which are not symmetry-related). The distribution of results for simulated annealing is shown in Figure 4 and that for the greedy algorithm in Figure 5.

The number of iterations used to generate Figure 4 and Figure 5 should be enough to give an approximate indication of the average performance of the two techniques. No serious attempt was made to find the combination of parameters that would optimize the simulated annealing results because this would have constituted a bias against the greedy algorithm and such a combination would have applied too specifically to this particular initial tableau. The parameter values in A.DAT appear to be adequate for the poker solitaire problem in general although the annealing schedule will have to be altered when the cards are changed, especially if the new initial tableau can produce a straight flush (see accompanying box). For other combinatorial optimization problems, both the parameter values and the annealing schedule will have to be changed.

The big questions, of course, are, "Does simulated annealing work?" and "Is it worth the effort?" Taken together, the results shown in Figure 4 and Figure 5 suggest that simulated annealing is definitely worth the effort. Although you could argue that, given equal amounts of computation time, both techniques find what appears to be the global optimum, this would be overly simplistic. For stochastic techniques, the important criterion is their average behavior.

For this example, the probability that any given iteration of the greedy algorithm yields a score greater than or equal to the average simulated annealing result is 6/1583. Thus, 183 iterations of the greedy algorithm afford a 50 percent chance of doing at least as well as simulated annealing (607 iterations for a 90 percent chance). The greedy algorithm, however, converges only 53 times faster in this example than does simulated annealing. Clearly, there is a higher payoff with simulated annealing.

Finally, both algorithms described in this article are stochastic in nature. Even though the tableau in Figure 1 was subjected to hundreds of simulated annealing runs, there is still no guarantee that the optimum found is, in fact, the global optimum. It's possible that there exists a tableau with a score greater than 4516 but that the path through the configuration space leading to it is so narrow that even the millions of reconfigurations carried out are insufficient to find it.

In this regard, it's interesting that all the tableaux found having scores of 4516 contained only 11 scoring hands, not 12. This being the case, there may be "room" for an even higher score. Better yet, what if a 13th hand were defined, consisting of the 4 corners plus the center card? Would there be any point in adding the extra degree of freedom? This intriguing thought is left as an exercise for DDJ readers.


uni(z) -- returns a double-precision float in the range (0,z) via rand1( ).

initialize( ) -- reads in the starting data from A.DAT and sets up the data structures, and so on.

check_equilibrium( ) -- uses the code given in the article to detect the equilibrium point.

selectwr( ) -- is a general routine to perform N selections without replacement.

report( ) -- produces the output statistics and the final tableaux.

T_RATIO -- T(new)/T(old). The initial value (here, 0.7) and 10 other values are listed in A.DAT.

EXG_CUTOFF -- is the probability of taking a giant step rather than a baby step (here, 0.2).


_SIMULATED ANNEALING_
by Michael P. McLaughlin
**[LISTING ONE]**

```c
/* ANNEAL.C -- Author: Dr. Michael P. McLaughlin */

#include <stdio.h>
#include <math.h>

#define ABS(x) ((x<0)?(-(x)):(x))
#define BOOLEAN int
#define TRUE 1
#define FALSE 0
#define FORWARD 1
#define BACK 0

struct cardtype {
   char card[3];
   int code;
} cards[25];

float ratios[10][2];
int tableau[25],best_tableau[25];
float temperature,best_temperature,t_low,t_min,t_ratio;
long seed,score,best_score,tot_successes,tot_failures,scor,exg_cutoff,
     report_time,report_interval,modulus = 2147483399,step = 1;
int next;
BOOLEAN equilibrium,frozen = FALSE,quench = FALSE,final = FALSE;

/* variables needed to assess equilibrium */
float totscore,totscore2,avg_score,sigma,half_sigma,score_limit;
long bscore,wscore,max_change,nsuccesses,nfailures,scale,
     incount,outcount;
int incount_limit,outcount_limit,succ_min,first_limit,ultimate_limit;

/* poker hands in tableau */
int hand[12][5] = {0,1,2,3,4,          /* rows */
                   5,6,7,8,9,
                   10,11,12,13,14,
                   15,16,17,18,19,
                   20,21,22,23,24,
                   0,5,10,15,20,       /* columns */
                   1,6,11,16,21,
                   2,7,12,17,22,
                   3,8,13,18,23,
                   4,9,14,19,24,
                   0,6,12,18,24,       /* diagonals */
                   4,8,12,16,20};
                   /* 0,4,12,20,24 = corners */

long rand1()
/* Get uniform 31-bit integer -- Ref. CACM, June 1988, pg. 742 */
{
   register long k;

   k = seed/52774;
   seed = 40692*(seed-k*52774)-k*3791;
   if (seed < 0)
      seed += modulus;
   return(seed);
}

double uni(z)
int z;
{
   return((double) z*rand1()/modulus);
}

void initialize()
/* Set up entire simulated annealing run. */
{
   char label[20];
   double exgc;
   register i;
   FILE *fp;

   fp = fopen("a.dat","r");
   fscanf(fp,"%f %s\n",&temperature,label);
   fscanf(fp,"%f %s\n",&t_low,label);
   fscanf(fp,"%f %s\n",&t_min,label);
   fscanf(fp,"%f %s\n",&avg_score,label);
   fscanf(fp,"%ld %s\n",&scale,label);
   fscanf(fp,"%f %s\n",&t_ratio,label);
   fscanf(fp,"%f %s\n",&sigma,label);
   fscanf(fp,"%lf %s\n",&exgc,label);
   fscanf(fp,"%d %s\n",&succ_min,label);
   fscanf(fp,"%d %s\n",&incount_limit,label);
   fscanf(fp,"%d %s\n",&outcount_limit,label);
   fscanf(fp,"%d %s\n",&first_limit,label);
   fscanf(fp,"%d %s\n",&ultimate_limit,label);
   fscanf(fp,"%ld %s\n",&report_interval,label);
   fscanf(fp,"%ld %s\n",&seed,label);
   half_sigma = 0.5*sigma;
   report_time = report_interval;
   score_limit = 20;
   exg_cutoff = modulus*exgc;
   for (i=0;i<10;i++)
      fscanf(fp,"%f %f\n",&ratios[i][0],&ratios[i][1]);
   for (i=0;i<25;i++) {
      fscanf(fp,"%d %s\n",&cards[i].code,cards[i].card);
      tableau[i] = cards[i].code;
   }
   fclose(fp);
   printf("                 TOTAL     TOTAL    CURRENT AVERAGE\n");
   printf("STEP TEMPERATURE SUCCESSES FAILURES SCORE   SCORE ");
   printf(" SIGMA\n\n");
}

void init()
/* set up for new temperature */
{
   nsuccesses = 0;
   nfailures = 0;
   equilibrium = FALSE;
   incount = 0;
   outcount = 0;
   bscore = 0;
   wscore = 100000;
   max_change = 0;
   totscore = 0;
   totscore2 = 0;
   if (temperature < t_min)
      frozen = TRUE;
}

void get_new_temperature()
{
   if (temperature < ratios[next][0]) {
      t_ratio = ratios[next][1];
      next++;
   }
   temperature = t_ratio*temperature;
}

void check_equilibrium()
/* Determine whether equilibrium has been reached. */
{
   if ((nsuccesses+nfailures) > ultimate_limit)
      equilibrium = TRUE;
   else if (nsuccesses >= succ_min) {
      if (incount > incount_limit)
         equilibrium = TRUE;
      else {
         if (outcount > outcount_limit) {
            if (nsuccesses > first_limit)
               equilibrium = TRUE;
            else {
               incount = 0;
               outcount = 0;
            }
         }
      }
   }
}

void update()
/* Compute statistics, etc. */
{
   float ascore,s;

   if (nsuccesses > 0) {
      ascore = totscore/nsuccesses;
      avg_score = ascore+scale;
      s = totscore2/nsuccesses-ascore*ascore;
      if (s > 0.0) {
         sigma = sqrt(s);
         half_sigma = 0.5*sigma;
         score_limit = avg_score-half_sigma;
      }
   }
   if ((temperature < t_low)&&((bscore-wscore) == max_change))
      frozen = TRUE;
}

void selectwr(array,mode,nchoices)
/* Select from array without replacement. */
int *array,mode,nchoices;
{
   int i,temp,size,index;

   size = mode==1 ? 12 : 25;
   for (i=0;i<nchoices;i++) {
      index = uni(size);
      temp = array[--size];
      array[size] = array[index];
      array[index] = temp;
   }
}

void reconfigure(direction)
/* If direction is FORWARD, make a new move; if it is BACK,
   restore the tableau to its previous configuration. */
int direction;
{
   static long partition[6] = {0,0,715827799,1296031795,1766307855,
                               2147483400};
   static int hindex[12] = {0,1,2,3,4,5,6,7,8,9,10,11};
   static int cindex[25] = {0,1,2,3,4,5,6,7,8,9,10,11,12,13,
                            14,15,16,17,18,19,20,21,22,23,24};
   static int overlap[66] = {-1,-1,-1,-1,-1,-1,-1,-1,-1,-1,0,5,10,15,
                             20,1,6,11,16,21,-1,2,7,12,17,22,-1,-1,3,
                             8,13,18,23,-1,-1,-1,4,9,14,19,24,-1,-1,-1,
                             -1,0,6,12,18,24,0,6,12,18,24,4,8,12,16,20,
                             20,16,12,8,4,12};
   static int mode,nchoices,common,last;   /* remember changes */
   int temp,hi,lo,i,j,k;

   if (direction == FORWARD) {
      if (rand1() < exg_cutoff) {          /* giant step */
         mode = 1;
         nchoices = 2;
         selectwr(hindex,mode,nchoices);
         hi = hindex[11];
         lo = hindex[10];                  /* order hand indices */
         if (hi < lo) {
            hi = lo;
            lo = hindex[11];
         }
         common = overlap[hi*(hi-1)/2+lo]; /* triangular matrix */
      }
      else {                               /* baby step */
         mode = 0;
         nchoices = 2;
         rand1();            /* How many cards to rotate? */
         while (seed > partition[nchoices])
            nchoices++;
         selectwr(cindex,mode,nchoices);
         temp = tableau[cindex[24]];       /* rotate forward */
         for (i=1,j=24;i<nchoices;i++,j--)
            tableau[cindex[j]] = tableau[cindex[j-1]];
         tableau[cindex[j]] = temp;
         last = j;
      }
   }
   if (mode == 1) {          /* swapping hands is symmetrical */
      register first,second,c;

      first = hindex[11];
      second = hindex[10];
      k = (common<0) ? 5 : 4;
      for (c=0,i=0,j=0;c<k;c++,i++,j++) {
         if (hand[first][i] == common)
            i++;
         if (hand[second][j] == common)
            j++;
         temp = tableau[hand[first][i]];
         tableau[hand[first][i]] = tableau[hand[second][j]];
         tableau[hand[second][j]] = temp;
      }
   }
   else if (direction == BACK) {           /* rotation is not */
      temp = tableau[cindex[last]];
      for (i=1,j=last;i<nchoices;i++,j++)
         tableau[cindex[j]] = tableau[cindex[j+1]];
      tableau[cindex[24]] = temp;
   }
}

long get_score(crds)
/* Return score of one hand. */
int *crds;
{
   static long scores[18] = {0,1,1,20,1,9,20,1760,1,9,9,
                             293,20,293,1760,108,215,27456};
   int flush,i,index = 0;

   flush = crds[0]&crds[1]&crds[2]&crds[3]&crds[4]&0xff00;
   for (i=0;i<5;i++)
      crds[i] = crds[i]&0xff;              /* ignore suits */
   {
      register b,k,t;    /* sort cards from high to low */

      for (b=4;b>0;b--)
         for (k=0;k<b;k++)
            if (crds[k] < crds[k+1]) {
               t = crds[k];
               crds[k] = crds[k+1];
               crds[k+1] = t;
            }
   }
   if (!flush) {                           /* multiple cards ? */
      if (crds[0] == crds[1]) index += 8;
      if (crds[1] == crds[2]) index += 4;
      if (crds[2] == crds[3]) index += 2;
      if (crds[3] == crds[4]) index += 1;
   }
   if (!index) {                           /* straight ? */
      if (((crds[0]-crds[4]) == 4)||((crds[0] == 14)&&(crds[1] == 5)))
         index = 15;
      if (flush)
         index = (index == 15) ? 17 : 16;
   }
   return(scores[index]);
}

long evaluate1()
/* Return total score of tableau (no corners). */
{
   int temp[5],h,c;
   long scr = 0;

   for (h=0;h<12;h++) {      /* h<13 for corners */
      for (c=0;c<5;c++)
         temp[c] = tableau[hand[h][c]];
      scr += get_score(temp);
   }
   return(scr);
}

void decide()
/* Either accept new configuration or restore old configuration */
{
   float pdt1,pdt2,sscor;
   long s;
   BOOLEAN acceptable = FALSE;

   scor = evaluate1();
   if (scor >= score) {
      if (scor > best_score) {
         register i;

         best_score = scor;
         best_temperature = temperature;
         for (i=0;i<25;i++)
            best_tableau[i] = tableau[i];
      }
      acceptable = TRUE;
   }
   else if (temperature > 0.0) {           /* uphill movement? */
      if (uni(1) < exp((scor-score)/temperature))   /* Equation 2 */
         acceptable = TRUE;
   }
   if (acceptable) {                       /* statistics, etc. */
      s = ABS(score-scor);
      if (s > max_change)
         max_change = s;
      score = scor;
      sscor = scor-scale;    /* to aid precision of sigma */
      totscore += sscor;
      totscore2 += sscor*sscor;
      nsuccesses++;
      if (ABS(avg_score-score) < half_sigma)
         incount++;
      else
         outcount++;
      if (score > bscore)                  /* maximization */
         bscore = score;
      else if (score < wscore)
         wscore = score;
   }
   else {                                  /* unacceptable */
      reconfigure(BACK);
      nfailures++;
   }
}

void report()
{
   tot_successes += nsuccesses;
   tot_failures += nfailures;
   printf("%3ld %10.1f %9ld %9ld %7ld %7.0f %5.1f\n",step,temperature,
          tot_successes,tot_failures,score,avg_score,sigma);
   report_time += report_interval;
   if (frozen) {
      int temp[25],k,kk;
      register i,j;
      BOOLEAN ok;

      if (final)
         printf("\nFINAL -- ");
      else if (quench)
         printf("\nQUENCH -- ");
      else
         printf("\nFROZEN -- ");
      printf("BEST SCORE = %5ld  BEST TEMPERATURE = %5.1f\n\n",
             best_score,best_temperature);
      for (i=0;i<25;i++) {
         k = best_tableau[i];
         j = 0;
         ok = FALSE;
         while (!ok) {
            if (cards[j].code == k) {
               temp[i] = j;
               ok = TRUE;
            }
            else
               j++;
         }
      }
      for (i=0;i<25;i+=5) {
         for (j=i;j<(i+5);j++) {
            k = 4-strlen(cards[temp[j]].card);
            for (kk=0;kk<k;kk++)
               printf(" ");
            printf("%s",cards[temp[j]].card);
         }
         printf("\n");
      }
      printf("\n");
   }
}

void try_new_temperature()
{
   init();
   while (!equilibrium) {
      reconfigure(FORWARD);
      decide();
      check_equilibrium();
   }
   update();
   if (!quench) {            /* ratchet */
      while ((float) score < score_limit) {   /* maximization */
         reconfigure(FORWARD);
         decide();
      }
   }
}

main()
{
   initialize();
   while (!frozen) {
      try_new_temperature();
      update();
      if ((step == report_time)||(frozen))
         report();
      step++;
      if (temperature > 0.0)
         get_new_temperature();
   }
   /* Quench frozen configuration. */
   temperature = 0;
   quench = TRUE;
   try_new_temperature();
   update();
   report();
   step++;
   /* Quench best configuration (rarely useful). */
   {
      register i;

      final = TRUE;
      for (i=0;i<25;i++)
         tableau[i] = best_tableau[i];
   }
   try_new_temperature();
   update();
   report();
   printf("SEED = %ld\n",seed);   /* seed of next run */
}
```

**[LISTING TWO]**

```c
if ((nsuccesses+nfailures) > ULTIMATE_LIMIT)
   equilibrium = true;
else if (nsuccesses >= SUCC_MIN) {
   if (incount > INCOUNT_LIMIT)
      equilibrium = true;
   else {
      if (outcount > OUTCOUNT_LIMIT) {
         if (nsuccesses > FIRST_LIMIT)
            equilibrium = true;
         else {
            incount = 0;
            outcount = 0;
         }
      }
   }
}
```

**[LISTING THREE]**

```
2150        temperature
1.5         t_low
0.1         t_min
40          avg_score
4400        scale
0.7         t_ratio
150         sigma
0.2         exg_cutoff
200         succ_min
250         incount_limit
400         outcount_limit
2700        first_limit
10000       ultimate_limit
1           report_interval
1234567890  seed
360 0.8
215 0.7
85 0.8
60 0.9
30 0.95
15 0.9
7 0.8
3 0.7
0 0.7
0 0.7
1032 8H
2061 KS
270 AC
526 AD
522 10D
1029 5H
265 9C
1035 JH
1037 KH
525 KD
515 3D
263 7C
2058 10S
2062 AS
518 6D
1036 QH
267 JC
259 3C
2053 5S
269 KC
520 8D
1028 4H
1038 AH
519 7D
1026 2H
```

**[EXAMPLE 1]**

Equation 1: N_{hi}/N_{lo} = exp((E_{lo} - E_{hi})/kT)
Equation 2: Prob(accept worse score) = exp((S_{lo} - S_{hi})/T)

Example 1: Equations that describe the Boltzmann distribution.

Copyright © 1989, *Dr. Dobb's Journal*