Fix UnboundLocalError with plugins and largefiles

When plugins are used in a repository that contains largefiles,
the following exception is raised as soon as the first largefile
is converted:

```
Traceback (most recent call last):
  File "fast-export/hg-fast-export.py", line 728, in <module>
    sys.exit(hg2git(options.repourl,m,options.marksfile,options.mappingfile,
  File "fast-export/hg-fast-export.py", line 581, in hg2git
    c=export_commit(ui,repo,rev,old_marks,max,c,authors,branchesmap,
  File "fast-export/hg-fast-export.py", line 366, in export_commit
    export_file_contents(ctx,man,modified,hgtags,fn_encoding,plugins)
  File "fast-export/hg-fast-export.py", line 222, in export_file_contents
    file_data = {'filename':filename,'file_ctx':file_ctx,'data':d}
UnboundLocalError: local variable 'file_ctx' referenced before assignment
```

This commit fixes the error by:

 * initializing `file_ctx` before the largefile handling takes place
 * providing a new `is_largefile` value for plugins so they can detect
   whether largefile handling was applied (and therefore the `file_ctx`
   object may no longer be in sync with the git version of the file)
Author: Günther Nußmüller
Date:   2025-07-25 12:28:23 +02:00
Parent: 95459e5599
Commit: d77765a23e

6 changed files with 131 additions and 3 deletions

@@ -0,0 +1,18 @@

```python
import sys
from mercurial import node


def build_filter(args):
    return Filter(args)


class Filter:
    def __init__(self, _):
        pass

    def file_data_filter(self, file_data):
        with open('largefile_info.txt', 'a') as f:
            f.write(f"filename: {file_data['filename']}\n")
            f.write(f"data size: {len(file_data['data'])} bytes\n")
            f.write(f"ctx rev: {file_data['file_ctx'].rev()}\n")
            f.write(f"ctx binary: {file_data['file_ctx'].isbinary()}\n")
            f.write(f"is largefile: {file_data.get('is_largefile', False)}\n")
            f.write("\n")
```