
Rob Swindell authored
This appears to fix the "lots of user.dat files open concurrently" behavior
that remains when a browser/client has the webv4 page open for a long
duration.

Eventually, the User objects created in this loop would be cleaned up (and
their user.dat file descriptors closed), but the default garbage-collection
interval (configurable) is 1000 calls to the CommonOperationCallback
function. That could mean as many as 1000 passes through this loop before a
GC occurs.

Since this is not a performance-sensitive loop (it already sleeps for a full
second each iteration), just force a garbage collection on each loop
iteration.
d07ae18a
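The pattern described in the commit message can be sketched as follows. This is an illustrative Python sketch, not Synchronet's actual code (Synchronet's webv4 is JavaScript, and the `UserRecord` class and counters here are invented stand-ins): short-lived objects in a polling loop hold an open-file-like resource, and forcing a GC pass on every iteration releases that resource promptly instead of waiting for the runtime's default collection interval.

```python
import gc

class UserRecord:
    """Stand-in for a User object that 'opens' user.dat on construction."""
    open_count = 0  # number of simulated open file descriptors

    def __init__(self):
        UserRecord.open_count += 1
        self._cycle = self  # reference cycle: refcounting alone won't free it

    def __del__(self):
        UserRecord.open_count -= 1  # 'close' the descriptor when collected

def poll_loop(iterations, force_gc):
    for _ in range(iterations):
        u = UserRecord()  # transient object, like reading one user record
        del u             # drop our reference; the cycle keeps it alive
        if force_gc:
            gc.collect()  # force collection now, closing the 'descriptor'
        # the real loop sleeps a full second here, so the GC cost is negligible

gc.disable()  # suppress automatic collection so the demo is deterministic
poll_loop(10, force_gc=False)
assert UserRecord.open_count == 10  # all ten 'descriptors' still open
gc.collect()                        # one sweep closes them all
gc.enable()

poll_loop(10, force_gc=True)
assert UserRecord.open_count == 0   # per-iteration GC leaks nothing
```

The trade-off mirrors the commit's reasoning: a forced collection per iteration costs CPU time, but in a loop that sleeps for a full second between passes, that cost is negligible next to keeping file descriptors open for up to 1000 iterations.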