view tests/clean.sh @ 251:a6c9a1c68646

Detect when we've run out of per-file space on insert() and batchinsert(). The problem was exposed by the Lute Music/frames1 dataset: we previously corrupted the trackTable and then got a segfault. This happened because the fileTable and trackTable happened to be mmap()ed next to each other, and the lack of overflow checking on the fileTable meant that continued insertion scribbled over the trackTable, which was twice as big (because it has to be at least one memory page in size). The root cause of all this is the --size creation argument, which needs to be split into --nfiles, --datasize and --dimensions so that the size of every table can be computed exactly (see the sketch below). No test case yet, because my /bin/sh currently points to dash, which gets about as far as line 6 of run-tests.sh before giving up. (We need either to fix the bashisms or to run /bin/bash explicitly.)
author mas01cr
date Mon, 31 Mar 2008 11:52:59 +0000
parents 1853beeb0521
children
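
The description above is prose only, so here is a minimal C sketch of the kind of check it calls for. Only fileTable, trackTable, insert()/batchinsert() and the --nfiles/--datasize/--dimensions creation options come from the changeset description; every other name, type and size below is a hypothetical stand-in, not the project's actual layout.

    /* Hypothetical sketch of the missing per-file space check.  The header
     * layout, field names and per-entry sizes are invented for illustration;
     * only the fileTable/trackTable split and the nfiles/datasize/dimensions
     * creation parameters come from the changeset description above. */
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint32_t nfiles;      /* --nfiles:     maximum number of tracks     */
        uint32_t dimensions;  /* --dimensions: width of each feature vector */
        uint64_t datasize;    /* --datasize:   total feature data in bytes  */
        uint32_t fileCount;   /* tracks inserted so far                     */
    } dbHeader;

    /* With explicit creation parameters the mmap()ed regions can be sized
     * exactly, instead of being guessed from a single opaque --size value. */
    static size_t fileTableBytes(const dbHeader *h)
    {
        return (size_t)h->nfiles * 256;              /* fixed-width key per track (assumed) */
    }

    static size_t trackTableBytes(const dbHeader *h)
    {
        return (size_t)h->nfiles * sizeof(uint32_t); /* per-track frame count (assumed) */
    }

    /* The check insert() and batchinsert() need before touching the
     * fileTable: refuse the write instead of running off the end of the
     * mapping into whatever happens to be mapped next. */
    static int haveRoomFor(const dbHeader *h, uint32_t newEntries)
    {
        if (newEntries > h->nfiles - h->fileCount) {
            fprintf(stderr, "insert: fileTable full (%u of %u entries used)\n",
                    h->fileCount, h->nfiles);
            return 0;
        }
        return 1;
    }
    /* insert() would call haveRoomFor(h, 1); batchinsert() would call it
     * with the size of the batch before writing anything. */

Splitting --size into the three creation options is what makes the bound (nfiles) available to such a check at all: with only a total byte count there is no way to know how many fileTable slots were actually allocated.
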
line source
#! /bin/sh
# Remove generated test output from each numbered test directory.

for file in [0-9][0-9][0-9][0-9]*; do
  if [ -d "${file}" ]; then
    echo "Cleaning ${file}"
    rm -f "${file}"/test*
    # Let a test directory do any extra cleanup of its own.
    if [ -f "${file}/clean.sh" ]; then
      (cd "${file}" && sh ./clean.sh)
    fi
  fi
done