Commit messages
Suggested by SonarQube.

Suggested by SonarQube.

Suggested by SonarQube to avoid magic numbers.

Found by SonarQube.

Found by SonarQube.

Suggested by SonarQube.

Found by SonarQube.

Found with SonarQube.

Suggested by SonarQube to avoid code repetition.

Found with SonarQube.

Found with SonarQube.

Found with SonarQube.

Found with SonarQube.

Fixes #11369.
The threading code was never used, because it's broken. No reason to keep it.
Classes implementing DataWriter perform two tasks: updating internal status files and rewriting output document files. These tasks could also be performed by two separate classes, and there may be cases where we want to rewrite output document files using already existing internal status files. As a first step in splitting up these classes, split the DataWriter interface into two interfaces, StatusUpdater and DocumentWriter. The current classes that implement DataWriter need to implement both new interfaces.
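As a rough illustration of the split, the two interfaces could look like this in Java; the method names and the example class are assumptions for illustration, not Onionoo's actual API:

```java
// Hypothetical sketch of the interface split; method names are assumptions.
interface StatusUpdater {
  void updateStatuses();  // update internal status files
}

interface DocumentWriter {
  void writeDocuments();  // rewrite output document files
}

// An existing data writer implements both interfaces for now, so callers
// can depend on only the task they actually need.
class ExampleDataWriter implements StatusUpdater, DocumentWriter {
  boolean statusesUpdated = false;
  boolean documentsWritten = false;

  public void updateStatuses() {
    statusesUpdated = true;  // stand-in for updating status files
  }

  public void writeDocuments() {
    documentsWritten = true;  // stand-in for rewriting output documents
  }
}
```

Callers that only rewrite documents can then hold a DocumentWriter reference and never see the status-update half.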
Our data writers register at the descriptor source as descriptor listeners to be notified when new descriptors are parsed. They use these descriptors to update their internal status files and to decide which output documents need to be rewritten. We now add a second type of listener, the fingerprint listener, which can be used to learn relay or bridge fingerprints from new descriptors. Knowing the fingerprints is sufficient to rewrite output documents, because their content is based only on internal status files. This will enable us to later split data writers into status updaters and document writers.
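A minimal sketch of the second listener type; the names (FingerprintListener, registerFingerprintListener, deliverFingerprints) are assumptions for illustration, not the actual Onionoo code:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical fingerprint listener; names are assumptions for illustration.
interface FingerprintListener {
  // relay is true for relay fingerprints, false for bridge fingerprints.
  void processFingerprints(Set<String> fingerprints, boolean relay);
}

class DescriptorSource {
  private List<FingerprintListener> fingerprintListeners = new ArrayList<>();

  void registerFingerprintListener(FingerprintListener listener) {
    fingerprintListeners.add(listener);
  }

  // Called after parsing a batch of new descriptors: listeners learn which
  // relays or bridges changed without being handed the descriptors themselves.
  void deliverFingerprints(Set<String> fingerprints, boolean relay) {
    for (FingerprintListener listener : fingerprintListeners) {
      listener.processFingerprints(fingerprints, relay);
    }
  }
}
```

Because a fingerprint listener never touches descriptor contents, a future document writer can implement only this interface and rebuild documents purely from internal status files.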
Broken in 558615fc41771c9d7bc31aa3b2403643534de9bc when taking out old news.

No new content, just trying to flatten the lists.
Prepares for providing monthly relay and bridge statistics (#11041).

Prepares for providing monthly relay and bridge statistics (#11041).

Implements #10331.
We compress histories by merging adjacent intervals to save disk space, unless we want intervals to stay distinct. For example, we might want to compress all intervals falling on the same UTC day but not merge them with intervals from the previous or next UTC day. However, an off-by-one error made us merge the wrong intervals: for example, we merged intervals from 23:00:00 on one day to 23:00:00 the next day.
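One way to express the intended day-boundary rule (a sketch of the assumed logic, not the actual Onionoo implementation): two adjacent intervals may only be merged if they start on the same UTC day, so an interval starting at 23:00:00 must not be merged with one starting at 00:00:00 the next day.

```java
// Sketch of the assumed merge rule: merge only intervals that start on the
// same UTC day. Comparing interval start times, both divided by the length
// of a UTC day, avoids the off-by-one that merged across midnight.
class IntervalMerging {
  private static final long ONE_DAY_MILLIS = 24L * 60L * 60L * 1000L;

  static boolean mayMerge(long firstStartMillis, long secondStartMillis) {
    return firstStartMillis / ONE_DAY_MILLIS
        == secondStartMillis / ONE_DAY_MILLIS;
  }
}
```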
The specification said that we only include non-null values for series of at least two subsequent data points. That's not accurate: we exclude histories that don't contain at least two subsequent non-null values, but included histories may still contain single non-null values with null values immediately before and after them.
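The corrected rule can be sketched as follows; this is a hypothetical helper written for illustration, not Onionoo's actual code:

```java
// Hypothetical helper illustrating the inclusion rule described above: a
// history is included only if some two adjacent data points are both
// non-null. Isolated non-null values alone don't qualify a history, but
// they may still appear inside a history that qualifies.
class HistoryFilter {
  static boolean includeHistory(Double[] values) {
    for (int i = 0; i + 1 < values.length; i++) {
      if (values[i] != null && values[i + 1] != null) {
        return true;
      }
    }
    return false;
  }
}
```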
Example details document where this produces invalid JSON (extract only):
"""
"or_addresses":["93.114.41.153:9401"],
"country":"ro",
"latitude":,
"longitude":,
"country_name":"Romania",
"as_number":"AS39743",
"as_name":"Voxility S.R.L.",
"""
Lines in GeoLite2-City-Blocks.csv and -Locations.csv:
"""
::ffff:93.114.41.0,121,,798549,,,,,1,0
"""
"""
798549,EU,Europe,RO,Romania,,,,,Europe/Bucharest
"""
Fixes #10908.

Workaround for bug fixed in 427111314eea8a01b43ad30bc10f744a8edcca81.
Implements #10707.

Implements #10619.

Implements Onionoo part of #10523.
Previously, we wrote out/summary and out/update files to a temp directory. Avoiding this step makes tests faster and prepares for more sophisticated tests.