[Pkg-privacy-commits] [Git][pkg-privacy-team/mat2][upstream] New upstream version 0.12.3
Georg Faerber (@georg)
georg at debian.org
Tue Mar 15 14:50:02 GMT 2022
Georg Faerber pushed to branch upstream at Privacy Maintainers / mat2
Commits:
bf6926fd by Georg Faerber at 2022-03-15T14:16:31+00:00
New upstream version 0.12.3
- - - - -
17 changed files:
- .gitlab-ci.yml
- CHANGELOG.md
- CONTRIBUTING.md
- README.md
- doc/mat2.1
- dolphin/mat2.desktop
- libmat2/archive.py
- libmat2/audio.py
- libmat2/epub.py
- libmat2/images.py
- libmat2/office.py
- libmat2/video.py
- mat2
- nautilus/mat2.py
- setup.py
- + tests/fuzz.py
- tests/test_libmat2.py
Changes:
=====================================
.gitlab-ci.yml
=====================================
@@ -31,9 +31,9 @@ linting:pylint:
image: $CONTAINER_REGISTRY:linting
stage: linting
script:
- - pylint --disable=no-else-return,no-else-raise,no-else-continue,unnecessary-comprehension,raise-missing-from,unsubscriptable-object --extension-pkg-whitelist=cairo,gi ./libmat2 ./mat2
+ - pylint --disable=no-else-return,no-else-raise,no-else-continue,unnecessary-comprehension,raise-missing-from,unsubscriptable-object,use-dict-literal,unspecified-encoding,consider-using-f-string,use-list-literal,too-many-statements --extension-pkg-whitelist=cairo,gi ./libmat2 ./mat2
 # Once nautilus-python is in Debian, uncomment the line below
- - pylint --disable=no-else-return,no-else-raise,no-else-continue,unnecessary-comprehension,raise-missing-from,unsubscriptable-object --extension-pkg-whitelist=Nautilus,GObject,Gtk,Gio,GLib,gi ./nautilus/mat2.py
+ - pylint --disable=no-else-return,no-else-raise,no-else-continue,unnecessary-comprehension,raise-missing-from,unsubscriptable-object,use-list-literal --extension-pkg-whitelist=Nautilus,GObject,Gtk,Gio,GLib,gi ./nautilus/mat2.py
linting:pyflakes:
image: $CONTAINER_REGISTRY:linting
@@ -56,9 +56,11 @@ tests:archlinux:
tests:debian:
image: $CONTAINER_REGISTRY:debian
stage: test
+ <<: *prepare_env
script:
- apt-get -qqy purge bubblewrap
- - python3 -m unittest discover -v
+ - su - mat2 -c "python3-coverage run --branch -m unittest discover -s tests/"
+ - su - mat2 -c "python3-coverage report --fail-under=95 -m --include 'libmat2/*'"
tests:debian_with_bubblewrap:
image: $CONTAINER_REGISTRY:debian
@@ -66,8 +68,8 @@ tests:debian_with_bubblewrap:
allow_failure: true
<<: *prepare_env
script:
- - su - mat2 -c "python3-coverage run --branch -m unittest discover -s tests/"
- - su - mat2 -c "python3-coverage report --fail-under=95 -m --include 'libmat2/*'"
+ - apt-get -qqy install bubblewrap
+ - python3 -m unittest discover -v
tests:fedora:
image: $CONTAINER_REGISTRY:fedora
=====================================
CHANGELOG.md
=====================================
@@ -1,3 +1,11 @@
+# 0.12.3 - 2022-01-06
+
+- Implement code for internationalization
+- Keep individual files compression type in zip files
+- Increase the robustness of mat2 against weird/corrupted files
+- Fix the dolphin integration
+- Add a fuzzer
+
# 0.12.2 - 2021-08-29
- Add support for aiff files
=====================================
CONTRIBUTING.md
=====================================
@@ -4,8 +4,14 @@ The main repository for mat2 is on [0xacab]( https://0xacab.org/jvoisin/mat2 ),
but you can send patches to jvoisin by [email](https://dustri.org/) if you prefer.
Do feel free to pick up [an issue]( https://0xacab.org/jvoisin/mat2/issues )
-and to send a pull-request. Please do check that everything is fine by running the
-testsuite with `python3 -m unittest discover -v` before submitting one :)
+and to send a pull-request.
+
+Before sending the pull-request, please do check that everything is fine by
+running the full test suite in GitLab. To do that, after forking mat2 in GitLab,
+you need to go in Settings -> CI/CD -> Runner and there enable shared runners.
+
+Mat2 also has unit tests (that are also run in the full test suite). You can run
+them with `python3 -m unittest discover -v`.
If you're fixing a bug or adding a new feature, please add tests accordingly,
this will greatly improve the odds of your merge-request getting merged.
=====================================
README.md
=====================================
@@ -6,9 +6,6 @@
```
-This software is currently in **beta**, please don't use it for anything
-critical.
-
# Metadata and privacy
Metadata consist of information that characterizes data.
=====================================
doc/mat2.1
=====================================
@@ -1,4 +1,4 @@
-.TH mat2 "1" "August 2021" "mat2 0.12.2" "User Commands"
+.TH mat2 "1" "January 2022" "mat2 0.12.3" "User Commands"
.SH NAME
mat2 \- the metadata anonymisation toolkit 2
=====================================
dolphin/mat2.desktop
=====================================
@@ -9,5 +9,5 @@ Name=Clean metadata
Name[de]=Metadaten löschen
Name[es]=Limpiar metadatos
Icon=/usr/share/icons/hicolor/scalable/apps/mat2.svg
-Exec=kdialog --yesno "$( mat2 -s %U )" --title "Clean Metadata?" && mat2 %U
-Exec[de]=kdialog --yesno "$( mat2 -s %U )" --title "Metadaten löschen?" && mat2 %U
+Exec=kdialog --yesno "$( mat2 -s %F )" --title "Clean Metadata?" && mat2 %U
+Exec[de]=kdialog --yesno "$( mat2 -s %F )" --title "Metadaten löschen?" && mat2 %U
=====================================
libmat2/archive.py
=====================================
@@ -120,6 +120,18 @@ class ArchiveBasedAbstractParser(abstract.AbstractParser):
# pylint: disable=unused-argument
return member
+ @staticmethod
+ def _get_member_compression(member: ArchiveMember):
+ """Get the compression of the archive member."""
+ # pylint: disable=unused-argument
+ return None
+
+ @staticmethod
+ def _set_member_compression(member: ArchiveMember, compression) -> ArchiveMember:
+ """Set the compression of the archive member."""
+ # pylint: disable=unused-argument
+ return member
+
def get_meta(self) -> Dict[str, Union[str, dict]]:
meta = dict() # type: Dict[str, Union[str, dict]]
@@ -184,6 +196,8 @@ class ArchiveBasedAbstractParser(abstract.AbstractParser):
original_permissions = os.stat(full_path).st_mode
os.chmod(full_path, original_permissions | stat.S_IWUSR | stat.S_IRUSR)
+ original_compression = self._get_member_compression(item)
+
if self._specific_cleanup(full_path) is False:
logging.warning("Something went wrong during deep cleaning of %s",
member_name)
@@ -223,6 +237,7 @@ class ArchiveBasedAbstractParser(abstract.AbstractParser):
zinfo = self.member_class(member_name) # type: ignore
zinfo = self._set_member_permissions(zinfo, original_permissions)
+ zinfo = self._set_member_compression(zinfo, original_compression)
clean_zinfo = self._clean_member(zinfo)
self._add_file_to_archive(zout, clean_zinfo, full_path)
@@ -368,12 +383,12 @@ class ZipParser(ArchiveBasedAbstractParser):
super().__init__(filename)
self.archive_class = zipfile.ZipFile
self.member_class = zipfile.ZipInfo
- self.zip_compression_type = zipfile.ZIP_DEFLATED
def is_archive_valid(self):
try:
- zipfile.ZipFile(self.filename)
- except zipfile.BadZipFile:
+ with zipfile.ZipFile(self.filename):
+ pass
+ except (zipfile.BadZipFile, OSError):
raise ValueError
@staticmethod
@@ -409,7 +424,7 @@ class ZipParser(ArchiveBasedAbstractParser):
assert isinstance(member, zipfile.ZipInfo) # please mypy
with open(full_path, 'rb') as f:
archive.writestr(member, f.read(),
- compress_type=self.zip_compression_type)
+ compress_type=member.compress_type)
@staticmethod
def _get_all_members(archive: ArchiveClass) -> List[ArchiveMember]:
@@ -420,3 +435,14 @@ class ZipParser(ArchiveBasedAbstractParser):
def _get_member_name(member: ArchiveMember) -> str:
assert isinstance(member, zipfile.ZipInfo) # please mypy
return member.filename
+
+ @staticmethod
+ def _get_member_compression(member: ArchiveMember):
+ assert isinstance(member, zipfile.ZipInfo) # please mypy
+ return member.compress_type
+
+ @staticmethod
+ def _set_member_compression(member: ArchiveMember, compression) -> ArchiveMember:
+ assert isinstance(member, zipfile.ZipInfo) # please mypy
+ member.compress_type = compression
+ return member
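
The new hooks amount to reading each member's compress_type before cleaning and writing it back onto the rebuilt ZipInfo. A minimal stand-alone sketch of the same round-trip, using only the standard zipfile module (paths and file names are illustrative):

    import zipfile

    # Build an archive whose members use different compression methods.
    with zipfile.ZipFile('/tmp/example.zip', 'w') as zout:
        zout.writestr('stored.txt', b'abc', compress_type=zipfile.ZIP_STORED)
        zout.writestr('deflated.txt', b'abc' * 100, compress_type=zipfile.ZIP_DEFLATED)

    # Rewrite it roughly the way ZipParser now does: remember the original
    # compress_type and carry it over to the freshly created ZipInfo.
    with zipfile.ZipFile('/tmp/example.zip') as zin, \
         zipfile.ZipFile('/tmp/example.cleaned.zip', 'w') as zout:
        for info in zin.infolist():
            data = zin.read(info.filename)
            clean = zipfile.ZipInfo(info.filename)       # fresh metadata
            clean.compress_type = info.compress_type     # _get/_set_member_compression()
            zout.writestr(clean, data, compress_type=clean.compress_type)
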
=====================================
libmat2/audio.py
=====================================
@@ -13,21 +13,25 @@ class MutagenParser(abstract.AbstractParser):
def __init__(self, filename):
super().__init__(filename)
try:
- mutagen.File(self.filename)
+ if mutagen.File(self.filename) is None:
+ raise ValueError
except mutagen.MutagenError:
raise ValueError
def get_meta(self) -> Dict[str, Union[str, dict]]:
f = mutagen.File(self.filename)
if f.tags:
- return {k:', '.join(v) for k, v in f.tags.items()}
+ return {k:', '.join(map(str, v)) for k, v in f.tags.items()}
return {}
def remove_all(self) -> bool:
shutil.copy(self.filename, self.output_filename)
f = mutagen.File(self.output_filename)
- f.delete()
- f.save()
+ try:
+ f.delete()
+ f.save()
+ except mutagen.MutagenError:
+ raise ValueError
return True
@@ -40,6 +44,9 @@ class MP3Parser(MutagenParser):
if not meta:
return metadata
for key in meta:
+ if isinstance(key, tuple):
+ metadata[key[0]] = key[1]
+ continue
if not hasattr(meta[key], 'text'): # pragma: no cover
continue
metadata[key.rstrip(' \t\r\n\0')] = ', '.join(map(str, meta[key].text))
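
The extra check matters because mutagen.File() returns None for files it cannot identify at all, instead of raising MutagenError; without it, later attribute access on None would fail instead of surfacing the clean ValueError the rest of libmat2 expects. A minimal sketch of the pattern (the path handling is illustrative):

    import mutagen

    def open_tagged_file(filename: str):
        try:
            f = mutagen.File(filename)      # returns None for unrecognised formats
        except mutagen.MutagenError:        # recognised but corrupted input
            raise ValueError(filename)
        if f is None:
            raise ValueError(filename)
        return f
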
=====================================
libmat2/epub.py
=====================================
@@ -108,7 +108,7 @@ class EPUBParser(archive.ZipParser):
item.append(uniqid)
# items without mandatory content
- for name in {'language', 'title'}:
+ for name in ['language', 'title']:
uniqid = ET.Element(self.metadata_namespace + name)
item.append(uniqid)
break # there is only a single <metadata> block
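
Using a list instead of a set literal here is presumably about determinism: iterating a set of strings is not guaranteed to yield the same order from one interpreter run to the next (string hashing is randomised), while a list always yields its elements in declaration order, keeping the rebuilt <metadata> block reproducible. In short:

    names = ['language', 'title']        # always 'language' first, then 'title'
    # A set literal such as {'language', 'title'} may iterate in either order,
    # depending on the interpreter's (randomised) string hashes.
    for name in names:
        print(name)
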
=====================================
libmat2/images.py
=====================================
@@ -26,7 +26,10 @@ class SVGParser(exiftool.ExiftoolParser):
}
def remove_all(self) -> bool:
- svg = Rsvg.Handle.new_from_file(self.filename)
+ try:
+ svg = Rsvg.Handle.new_from_file(self.filename)
+ except GLib.GError:
+ raise ValueError
dimensions = svg.get_dimensions()
surface = cairo.SVGSurface(self.output_filename,
dimensions.height,
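
Rsvg.Handle.new_from_file() raises GLib.GError on unreadable or malformed SVG input; catching it and re-raising ValueError keeps the parser's error contract consistent with the rest of libmat2. A minimal sketch of the same guard, assuming the GObject introspection bindings for Rsvg 2.0 are available:

    import gi
    gi.require_version('Rsvg', '2.0')
    from gi.repository import Rsvg, GLib

    def load_svg(path: str) -> Rsvg.Handle:
        try:
            return Rsvg.Handle.new_from_file(path)
        except GLib.GError:              # I/O failure or invalid SVG
            raise ValueError(path)
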
=====================================
libmat2/office.py
=====================================
@@ -179,7 +179,7 @@ class MSOfficeParser(ZipParser):
return False
# rsid, tags or attributes, are always under the `w` namespace
- if 'w' not in namespace.keys():
+ if 'w' not in namespace:
return True
parent_map = {c:p for p in tree.iter() for c in p}
@@ -215,7 +215,7 @@ class MSOfficeParser(ZipParser):
return False
# The nsid tag is always under the `w` namespace
- if 'w' not in namespace.keys():
+ if 'w' not in namespace:
return True
parent_map = {c:p for p in tree.iter() for c in p}
@@ -328,7 +328,7 @@ class MSOfficeParser(ZipParser):
logging.error("Unable to parse %s: %s", full_path, e)
return False
- if 'p14' not in namespace.keys():
+ if 'p14' not in namespace:
return True # pragma: no cover
for item in tree.iterfind('.//p14:creationId', namespace):
@@ -344,7 +344,7 @@ class MSOfficeParser(ZipParser):
logging.error("Unable to parse %s: %s", full_path, e)
return False
- if 'p' not in namespace.keys():
+ if 'p' not in namespace:
return True # pragma: no cover
for item in tree.iterfind('.//p:sldMasterId', namespace):
@@ -486,7 +486,7 @@ class LibreOfficeParser(ZipParser):
logging.error("Unable to parse %s: %s", full_path, e)
return False
- if 'office' not in namespace.keys(): # no revisions in the current file
+ if 'office' not in namespace: # no revisions in the current file
return True
for text in tree.getroot().iterfind('.//office:text', namespace):
=====================================
libmat2/video.py
=====================================
@@ -50,7 +50,7 @@ class AbstractFFmpegParser(exiftool.ExiftoolParser):
ret = dict() # type: Dict[str, Union[str, dict]]
for key, value in meta.items():
- if key in self.meta_key_value_allowlist.keys():
+ if key in self.meta_key_value_allowlist:
if value == self.meta_key_value_allowlist[key]:
continue
ret[key] = value
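
The `.keys()` removals in office.py and video.py above are purely stylistic: `key in some_dict` already tests key membership, so materialising a keys view first does nothing but allocate an extra object. For instance:

    namespace = {'w': 'http://example.com/wordprocessingml'}   # illustrative value
    assert ('w' in namespace) == ('w' in namespace.keys())     # same result, one fewer view object
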
=====================================
mat2
=====================================
@@ -13,11 +13,11 @@ import concurrent.futures
try:
from libmat2 import parser_factory, UNSUPPORTED_EXTENSIONS
from libmat2 import check_dependencies, UnknownMemberPolicy
-except ValueError as e:
- print(e)
+except ValueError as ex:
+ print(ex)
sys.exit(1)
-__version__ = '0.12.2'
+__version__ = '0.12.3'
# Make pyflakes happy
assert Set
=====================================
nautilus/mat2.py
=====================================
@@ -16,6 +16,7 @@ import queue
import threading
from typing import Tuple, Optional, List
from urllib.parse import unquote
+import gettext
import gi
gi.require_version('Nautilus', '3.0')
@@ -25,6 +26,8 @@ from gi.repository import Nautilus, GObject, Gtk, Gio, GLib, GdkPixbuf
from libmat2 import parser_factory
+_ = gettext.gettext
+
def _remove_metadata(fpath) -> Tuple[bool, Optional[str]]:
""" This is a simple wrapper around libmat2, because it's
@@ -51,11 +54,11 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
self.infobar.set_show_close_button(True)
self.infobar_hbox = Gtk.Box(orientation=Gtk.Orientation.HORIZONTAL)
- btn = Gtk.Button("Show")
+ btn = Gtk.Button(_("Show"))
btn.connect("clicked", self.__cb_show_failed)
self.infobar_hbox.pack_end(btn, False, False, 0)
- infobar_msg = Gtk.Label("Failed to clean some items")
+ infobar_msg = Gtk.Label(_("Failed to clean some items"))
self.infobar_hbox.pack_start(infobar_msg, False, False, 0)
self.infobar.get_content_area().pack_start(self.infobar_hbox, True, True, 0)
@@ -90,9 +93,9 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
window = Gtk.Window()
headerbar = Gtk.HeaderBar()
window.set_titlebar(headerbar)
- headerbar.props.title = "Metadata removal failed"
+ headerbar.props.title = _("Metadata removal failed")
- close_buton = Gtk.Button("Close")
+ close_buton = Gtk.Button(_("Close"))
close_buton.connect("clicked", lambda _: window.close())
headerbar.pack_end(close_buton)
@@ -107,9 +110,9 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
""" Validate if a given file FileInfo `fileinfo` can be processed.
Returns a boolean, and a text reason why"""
if fileinfo.get_uri_scheme() != "file" or fileinfo.is_directory():
- return False, "Not a file"
+ return False, _("Not a file")
elif not fileinfo.can_write():
- return False, "Not writeable"
+ return False, _("Not writeable")
return True, ""
def __create_treeview(self) -> Gtk.TreeView:
@@ -120,7 +123,7 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
column_pixbuf = Gtk.TreeViewColumn("Icon", renderer_pixbuf, pixbuf=0)
treeview.append_column(column_pixbuf)
- for idx, name in enumerate(['File', 'Reason']):
+ for idx, name in enumerate([_('File'), _('Reason')]):
renderer_text = Gtk.CellRendererText()
column_text = Gtk.TreeViewColumn(name, renderer_text, text=idx+1)
treeview.append_column(column_text)
@@ -180,7 +183,7 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
return False
progressbar.pulse()
- progressbar.set_text("Cleaning %s" % fname)
+ progressbar.set_text(_("Cleaning %s") % fname)
progressbar.show_all()
self.infobar_hbox.show_all()
self.infobar.show_all()
@@ -202,7 +205,7 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
fpath = unquote(fileinfo.get_uri()[7:]) # `len('file://') = 7`
success, mtype = _remove_metadata(fpath)
if not success:
- self.failed_items.append((fname, mtype, 'Unsupported/invalid'))
+ self.failed_items.append((fname, mtype, _('Unsupported/invalid')))
processing_queue.put(None) # signal that we processed all the files
return True
@@ -236,8 +239,8 @@ class Mat2Extension(GObject.GObject, Nautilus.MenuProvider, Nautilus.LocationWid
item = Nautilus.MenuItem(
name="mat2::Remove_metadata",
- label="Remove metadata",
- tip="Remove metadata"
+ label=_("Remove metadata"),
+ tip=_("Remove metadata")
)
item.connect('activate', self.__cb_menu_activate, files)
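
With `_ = gettext.gettext`, the marked strings still pass through unchanged unless a catalogue has been bound elsewhere; the alias mainly makes them extractable with xgettext and leaves room for wiring in a real translation later. A hedged sketch of what that binding could look like (the 'mat2' domain and the locale directory are assumptions, not something this commit installs):

    import gettext

    # Hypothetical: load a compiled mat2.mo catalogue if one is installed,
    # otherwise fall back to the identity translation.
    translation = gettext.translation('mat2', localedir='/usr/share/locale',
                                      fallback=True)
    _ = translation.gettext

    print(_("Remove metadata"))   # translated if a catalogue matches the current locale
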
=====================================
setup.py
=====================================
@@ -5,7 +5,7 @@ with open("README.md", encoding='utf-8') as fh:
setuptools.setup(
name="mat2",
- version='0.12.2',
+ version='0.12.3',
author="Julien (jvoisin) Voisin",
author_email="julien.voisin+mat2 at dustri.org",
description="A handy tool to trash your metadata",
=====================================
tests/fuzz.py
=====================================
@@ -0,0 +1,54 @@
+import mimetypes
+import os
+import sys
+
+sys.path.append('..')
+
+import atheris
+
+with atheris.instrument_imports(enable_loader_override=False):
+ from libmat2 import parser_factory, UNSUPPORTED_EXTENSIONS
+
+extensions = set()
+for parser in parser_factory._get_parsers(): # type: ignore
+ for mtype in parser.mimetypes:
+ if mtype.startswith('video'):
+ continue
+ if 'aif' in mtype:
+ continue
+ if 'wav' in mtype:
+ continue
+ if 'gif' in mtype:
+ continue
+ if 'aifc' in mtype:
+ continue
+ for extension in mimetypes.guess_all_extensions(mtype):
+ if extension not in UNSUPPORTED_EXTENSIONS:
+ extensions.add(extension)
+extensions = list(extensions)
+
+
+
+def TestOneInput(data):
+ fdp = atheris.FuzzedDataProvider(data)
+ extension = fdp.PickValueInList(extensions)
+ data = fdp.ConsumeBytes(sys.maxsize)
+
+ fname = '/tmp/mat2_fuzz' + extension
+
+ with open(fname, 'wb') as f:
+ f.write(data)
+ try:
+ p, _ = parser_factory.get_parser(fname)
+ if p:
+ p.sandbox = False
+ p.get_meta()
+ p.remove_all()
+ p, _ = parser_factory.get_parser(fname)
+ p.get_meta()
+ except ValueError:
+ pass
+ os.remove(fname)
+
+atheris.Setup(sys.argv, TestOneInput)
+atheris.Fuzz()
=====================================
tests/test_libmat2.py
=====================================
@@ -175,14 +175,30 @@ class TestGetMeta(unittest.TestCase):
def test_zip(self):
with zipfile.ZipFile('./tests/data/dirty.zip', 'w') as zout:
- zout.write('./tests/data/dirty.flac')
- zout.write('./tests/data/dirty.docx')
- zout.write('./tests/data/dirty.jpg')
+ zout.write('./tests/data/dirty.flac',
+ compress_type = zipfile.ZIP_STORED)
+ zout.write('./tests/data/dirty.docx',
+ compress_type = zipfile.ZIP_DEFLATED)
+ zout.write('./tests/data/dirty.jpg',
+ compress_type = zipfile.ZIP_BZIP2)
+ zout.write('./tests/data/dirty.txt',
+ compress_type = zipfile.ZIP_LZMA)
p, mimetype = parser_factory.get_parser('./tests/data/dirty.zip')
self.assertEqual(mimetype, 'application/zip')
meta = p.get_meta()
self.assertEqual(meta['tests/data/dirty.flac']['comments'], 'Thank you for using MAT !')
self.assertEqual(meta['tests/data/dirty.docx']['word/media/image1.png']['Comment'], 'This is a comment, be careful!')
+
+ with zipfile.ZipFile('./tests/data/dirty.zip') as zipin:
+ members = {
+ 'tests/data/dirty.flac' : zipfile.ZIP_STORED,
+ 'tests/data/dirty.docx': zipfile.ZIP_DEFLATED,
+ 'tests/data/dirty.jpg' : zipfile.ZIP_BZIP2,
+ 'tests/data/dirty.txt' : zipfile.ZIP_LZMA,
+ }
+ for k, v in members.items():
+ self.assertEqual(zipin.getinfo(k).compress_type, v)
+
os.remove('./tests/data/dirty.zip')
def test_wmv(self):
@@ -413,7 +429,7 @@ class TestCleaning(unittest.TestCase):
'name': 'gif',
'parser': images.GIFParser,
'meta': {'Comment': 'this is a test comment'},
- 'expected_meta': {},
+ 'expected_meta': {'TransparentColor': '5'},
},{
'name': 'css',
'parser': web.CSSParser,
@@ -515,9 +531,11 @@ class TestCleaning(unittest.TestCase):
self.assertTrue(p1.remove_all())
p2 = case['parser'](p1.output_filename)
- for k, v in p2.get_meta().items():
- self.assertIn(k, case['expected_meta'])
- self.assertIn(str(case['expected_meta'][k]), str(v))
+ meta = p2.get_meta()
+ if meta:
+ for k, v in p2.get_meta().items():
+ self.assertIn(k, case['expected_meta'], '"%s" is not in "%s" (%s)' % (k, case['expected_meta'], case['name']))
+ self.assertIn(str(case['expected_meta'][k]), str(v))
self.assertTrue(p2.remove_all())
os.remove(target)
@@ -595,9 +613,14 @@ class TestCleaning(unittest.TestCase):
class TestCleaningArchives(unittest.TestCase):
def test_zip(self):
with zipfile.ZipFile('./tests/data/dirty.zip', 'w') as zout:
- zout.write('./tests/data/dirty.flac')
- zout.write('./tests/data/dirty.docx')
- zout.write('./tests/data/dirty.jpg')
+ zout.write('./tests/data/dirty.flac',
+ compress_type = zipfile.ZIP_STORED)
+ zout.write('./tests/data/dirty.docx',
+ compress_type = zipfile.ZIP_DEFLATED)
+ zout.write('./tests/data/dirty.jpg',
+ compress_type = zipfile.ZIP_BZIP2)
+ zout.write('./tests/data/dirty.txt',
+ compress_type = zipfile.ZIP_LZMA)
p = archive.ZipParser('./tests/data/dirty.zip')
meta = p.get_meta()
self.assertEqual(meta['tests/data/dirty.docx']['word/media/image1.png']['Comment'], 'This is a comment, be careful!')
@@ -609,6 +632,16 @@ class TestCleaningArchives(unittest.TestCase):
self.assertEqual(p.get_meta(), {})
self.assertTrue(p.remove_all())
+ with zipfile.ZipFile('./tests/data/dirty.zip') as zipin:
+ members = {
+ 'tests/data/dirty.flac' : zipfile.ZIP_STORED,
+ 'tests/data/dirty.docx': zipfile.ZIP_DEFLATED,
+ 'tests/data/dirty.jpg' : zipfile.ZIP_BZIP2,
+ 'tests/data/dirty.txt' : zipfile.ZIP_LZMA,
+ }
+ for k, v in members.items():
+ self.assertEqual(zipin.getinfo(k).compress_type, v)
+
os.remove('./tests/data/dirty.zip')
os.remove('./tests/data/dirty.cleaned.zip')
os.remove('./tests/data/dirty.cleaned.cleaned.zip')
View it on GitLab: https://salsa.debian.org/pkg-privacy-team/mat2/-/commit/bf6926fd96b4ad2efdbb2c11ed042e83a353230e