X-Git-Url: https://fleuret.org/cgi-bin/gitweb/gitweb.cgi?p=profiler-torch.git;a=blobdiff_plain;f=README.md;h=2adb72c321a7a7a3b6199bbf26643a91cc8b930c;hp=2bb3011ee342309d6aa2c9ff1075eaa63ac96499;hb=e927faab65fb190dc01959236c07df46f3d28946;hpb=4ed0c97b543f69caac452b7775a5ca57d017637a

diff --git a/README.md b/README.md
index 2bb3011..2adb72c 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,8 @@
 This is a simple profiler to estimate processing time per module per function.
 
+It seems to work okay, but there has been no heavy testing so far.
 See test-profiler.lua for a short example.
+
 ### profiler.decorate(model, [functionsToDecorate]) ###
 
 This function should be called before starting the computation.
@@ -9,12 +11,6 @@ It replaces functions specified in functionsToDecorate by instrumented versions
 
 It also resets the accumulated timings to zero.
 
-### profiler.print(model, [nbSamples]) ###
-
-Prints the measured processing times. If nbSamples is provided, the time per samples will also be printed.
-
-It seems to work okay, but there was no heavy testing so far.
+### profiler.print(model, [nbSamples], [totalTime]) ###
 
---
-Francois Fleuret
-Dec 5th, 2016
+Prints the measured processing times. If nbSamples is provided, the time per sample will also be printed. If totalTime is provided, the percentages will also be printed.
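
The updated README documents two entry points, `profiler.decorate` and `profiler.print`. Below is a minimal usage sketch of how they fit together; the `require 'profiler'` loading convention and the toy model are assumptions, and test-profiler.lua in the repository remains the authoritative example.

```lua
-- Minimal usage sketch; the require name and the toy model are assumptions,
-- see test-profiler.lua for the actual example shipped with the package.
require 'torch'
require 'nn'
local profiler = require 'profiler'   -- assumed module name

local model = nn.Sequential()
   :add(nn.Linear(100, 50))
   :add(nn.Tanh())
   :add(nn.Linear(50, 10))

-- Must be called before starting the computation; it also resets the
-- accumulated timings to zero.
profiler.decorate(model)

local nbSamples = 256
local input = torch.randn(nbSamples, 100)

local timer = torch.Timer()
local output = model:forward(input)
model:backward(input, output)   -- output reused as a dummy gradient
local totalTime = timer:time().real

-- Prints the measured times; with nbSamples the time per sample is also
-- printed, and with totalTime the percentages are also printed.
profiler.print(model, nbSamples, totalTime)
```

Here totalTime is simply measured by the caller with a torch.Timer; the README only states that percentages are printed when a total is supplied, not how that total should be obtained.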