Archive

Archive for the ‘FOSS’ Category

ownCloud and CryFS

August 17, 2019 4 comments

It is a great idea to encrypt files on the client side before uploading them to an ownCloud server if that server is not running in a controlled environment, or if one simply wants to act defensively and minimize risk.

Some people think it is a great idea to include the functionality in the sync client.

I don’t agree, because it combines two very complex topics in one code base and makes the code difficult to maintain. The risk is high of ending up with a code base that nobody is able to maintain properly any more. So let’s avoid that for ownCloud and look for alternatives.

A good way is to use a so-called encrypted overlay filesystem and let ownCloud sync the encrypted files. The downside is that you cannot use the encrypted files in the web interface, because it cannot easily decrypt them. To me, that is not overly important, because I want to sync files between different clients, which is probably the most common use case.

Encrypted overlay filesystems put the encrypted data in one directory called the cipher directory. A decrypted representation of the data is mounted to a different directory, in which the user works.

That is easy to set up and use, and in principle also a good fit for file sync software like ownCloud, because it does not store the files in one huge container file that needs to be synced whenever a single bit changes, as other solutions do.

To use it, the cipher directory must be configured as the local sync dir of the client. If a file is changed in the mounted dir, the overlay filesystem changes the crypto files in the cipher dir. These are synced by the ownCloud client.
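As a sketch, such a setup could look like the following. The paths and server URL are just placeholders, and `owncloudcmd` is the command line sync tool shipped with the ownCloud desktop client (one could equally add the cipher dir as a sync folder in the GUI client):

```shell
# Create the cipher directory (synced, encrypted) and the mount
# point (local, decrypted view the user works in).
mkdir -p ~/cipher ~/plain

# Mount the overlay; on first use CryFS initializes the cipher dir
# and asks for a password.
cryfs ~/cipher ~/plain

# Let the ownCloud client sync the *cipher* directory, e.g. with the
# bundled command line tool (credentials are prompted for):
owncloudcmd ~/cipher https://myserver.example/owncloud
```

Only the encrypted blocks ever leave the machine; in the web interface the account shows opaque block files rather than the plain documents.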

One of the solutions I tried is CryFS. It works nicely in general, but is unfortunately very slow together with ownCloud sync.

The reason is that CryFS chunks all files in the cipher dir into 16 kB blocks, which are spread over a set of directories. That is very beneficial because file names and sizes cannot be reconstructed from the cipher dir, but it hits one of the weak spots of ownCloud sync: ownCloud is traditionally a bit slow with many small files spread over many directories. That shows dramatically in a test with CryFS: adding eleven new files with an overall size of around 45 MB to a CryFS filesystem directory makes the ownCloud client upload for 6:30 minutes.
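A quick back-of-the-envelope calculation illustrates the file count involved. Assuming 16 KiB blocks and ignoring the additional metadata blocks CryFS creates, 45 MB of payload already decomposes into thousands of files:

```shell
# Rough lower bound on the number of 16 KiB blocks needed for 45 MB
# of payload; CryFS's tree metadata blocks come on top of this.
payload=$((45 * 1024 * 1024))     # 45 MB in bytes
blocksize=$((16 * 1024))          # 16 KiB per block
blocks=$(( (payload + blocksize - 1) / blocksize ))
echo "$blocks"                    # → 2880
```

Nearly three thousand small files, each of which the sync client has to discover, checksum and propagate individually.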

Adding another four files with a total size of a bit over 1 MB results in an upload of 130 files and directories, with an overall size of 1.1 MB.

A typical change use case, like editing an existing office text document locally, is not that bad. CryFS splits an 8.2 kB LibreOffice text document into three 16 kB files in three directories here. When one word is inserted, CryFS needs to create three new dirs in the cipher dir and uploads four new 16 kB blocks.

My personal conclusion: CryFS is an interesting project. It has nice integration in the KDE desktop with Plasma Vault. Splitting files into equally sized blocks is good, because it does not allow guessing at the data based on names and sizes. However, for syncing with ownCloud, it is not the best partner.

If there is a way to improve the situation, I would be eager to learn about it. Maybe the size of the blocks can be increased, or the number of directories limited?
Also, the upcoming ownCloud sync client version 2.6.0 again brings optimizations in the discovery and propagation of changes; I am sure that will improve the situation.
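CryFS does expose a `--blocksize` option (in bytes), so the block size experiment is at least possible; whether it actually helps the sync would need measuring, and the flag should be checked against `cryfs --help` of the installed version:

```shell
# Hypothetical tuning: create a new CryFS filesystem with 512 KiB
# blocks instead of the 16 KiB default. Fewer, larger files in the
# cipher dir should suit the ownCloud sync better, at the price of
# re-uploading more data for every small change.
cryfs --blocksize 524288 ~/cipher-big ~/plain-big
```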

Let’s see what other alternatives can be found.

Categories: FOSS, KDE, ownCloud Tags: , , ,

Eighty Percent ownCloud

December 23, 2018 25 comments

Recently the German computer magazine c’t published an article about file sync solutions with native sync clients (“Unter eigener Regie”, c’t 23, 2018). The article was pretty positive about the FOSS solution of… Nextcloud! I was wondering why they had not chosen ownCloud’s client, as my feeling is that ownCloud is far more active and innovative in developing the desktop client for file synchronization together with the community.


Code lines changed as of Nov. 10, 2018

That motivated me to do some investigation into what the Nextcloud client actually consists of (as of Nov. 10, 2018). I looked into the NC desktop client git repository and grouped the numbers of commits by people who can clearly be associated with either the ownCloud or the Nextcloud project, with “other communities”, or with machine commits. Since the number of commits could be misleading (maybe some commits are huge?), I did the same exercise with the number of changed lines of code.
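For those who want to repeat the exercise: the raw numbers can be extracted with plain git. This is a sketch of the counting step (the author names in the sample are made up); in a clone of the repository one would feed it the real `git log` output:

```shell
# Sum up changed lines (additions + deletions) per author from the
# machine-readable output of:  git log --numstat --format='AUTHOR:%an'
# Binary files report "-" for the counts; awk coerces that to 0.
sum_changed_lines() {
  awk '/^AUTHOR:/ { author = substr($0, 8); next }
       NF == 3    { changed[author] += $1 + $2 }
       END        { for (a in changed) print changed[a], a }' "$@" | sort -rn
}

# In a clone of the client repository one would run:
#   git log --numstat --format='AUTHOR:%an' | sum_changed_lines

# Tiny made-up sample in the same format (git separates the numstat
# columns with tabs; awk splits on any whitespace):
sum_changed_lines <<'EOF'
AUTHOR:Alice
10 2 src/sync.cpp
3 1 src/folder.cpp
AUTHOR:Bob
5 0 README.md
EOF
```

On the sample this prints `16 Alice` and `5 Bob`; grouping the real authors into the ownCloud, Nextcloud, “other communities” and machine buckets then remains a manual step.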

Looking at the changed lines, the top six contributors to the Nextcloud desktop client are only active in the ownCloud project. Number seven is an “other community” contributor whose project the client was based on in the beginning. Numbers eight to eleven go to Nextcloud, with low percentage figures.


# of commits to the Nextcloud Desktop repository as of Nov. 10, 2018

As a result, far more than 80% of the changed lines of the Nextcloud client are actually work that ownClouders did (not considering the machine commits), in the past and also today. The number would be even higher if it considered all the commits that went into the NC repo with an NC author but are actually ownCloud patches whose original author got lost along the way when they were merged through an NC branch. It looks like the Nextcloud developers have actually added fewer commits to their client than all “other community” developers so far.

No wonder, it is a fork, you might think, and that is of course true. However, to my taste these numbers do not reflect a “constructive” fork driving things forward when we talk about sync technology.

That is all fine, and I am proud that the work we do in ownCloud actually stimulates two projects, with different focus areas nowadays. On the other hand, I would appreciate it if the users of the technology took a closer look to understand who really innovates, drives things forward and also fixes the nasty bugs in the stack. As a matter of fairness, that should be acknowledged. That is the motivation that keeps free software contributors busy and communities proud.

Change in Professional Life

November 23, 2018 2 comments

This November has been very exciting for me so far, as I started a new job at a company called Heidolph. I left SUSE after working there for another two years. My role there was pretty far removed from interesting technical work, which I missed more and more, so I decided to grab the opportunity and join a new adventure.

Heidolph is a mature German engineering company building premium laboratory equipment. It is based in Schwabach, Germany. For me it is the first time that I am working in a company that doesn’t only do software. At Heidolph, software is just one building block besides mechanical and electronic parts and tons of special know-how. That is a very different situation and a lot to learn for me, but in a small, co-located team of great engineers, I am able to catch up fast in this interesting area.

We build software for the next generation of Heidolph devices based on Linux and C++/Qt. Both technologies are at the center of my interest; over the years it has become more than clear to me that I want to continue with them and deepen my knowledge even more.

Since the meaning of open source has changed a lot since I started to contribute to free software and KDE in particular, it was a noticeable but not difficult step for me to move away from a self-proclaimed open source company towards a company that uses open source technologies as one part of its tooling and is interested in learning about the processes we use in open source to build great products. It is an exciting move for me, where I will learn a lot but also benefit from my experience. Of course, that does not mean that I will stop contributing to open source projects.

We are still building up the team and are looking for a Software Quality Engineer. If you are interested in working with us in an exciting environment, you might want to get in touch.

Categories: FOSS, KDE, Opinion, Qt Tags: ,

Kraft Version 0.82

October 19, 2018 5 comments

A new release of Kraft, the Qt- and KDE-based software that helps organize business documents in small companies, has arrived.

A couple of days ago version 0.82 was released. It is mainly a bugfix release, but it also comes with a few new features. Users were asking for some new functions that they needed in order to switch their business communication to Kraft, and I always try to make that a priority.

The most visible feature is a light rework of the calculation dialog that allows users to do price calculations for templates. It was cleaned up, superfluous elements were finally removed, and the remaining ones now work as expected. The distinction between manual price and calculated price should be even clearer now. Time calculations can now be done not only at the granularity of minutes, as that was too coarse for certain use cases: the unit for a time slice can now be seconds, minutes or hours.


New calculation dialog in 0.82

Apart from that, sending documents by email was fixed, for example, and in addition to doing it through Thunderbird, Kraft can now also utilize the xdg-email tool to work with the desktop’s standard mail client, such as KMail.

Quite a few more bugfixes make this a nice release. Check the full changelog! An update is recommended.

Thanks for your comments or suggestions about Kraft!

Categories: FOSS, KDE, Kraft, Release Tags: , , ,

Kraft out of KDE

March 22, 2018 13 comments

Following my last blog post about Kraft’s upcoming release 0.80, I got a lot of positive reactions.

There was one reaction, however, that puzzled me a bit, and I want to share my thoughts here. It is about a comment on my announcement that I prefer to continue developing Kraft on GitHub. The commenter reminded me in a friendly way that there is still Kraft code on KDE infrastructure, and that switching to a different repository might waste people’s time when they work with the KDE repo.

That is a fair statement; of course I don’t want to waste people’s time. What sounds a bit strange to me is the second paragraph, which says that if I decide to stay with GitHub, I should let the KDE people know that I wish Kraft to no longer be a KDE project.

But … I never felt that Kraft should not be a KDE project any more.

A little History

Kraft has come a long way together with KDE. I started Kraft in (probably) 2004, gave a talk about it at Akademy in Dublin in 2006, and have maintained it with the best effort I could contribute until today. There is a small but loyal community around Kraft.

During all that time I got little substantial contribution to the code directly, with the exception of one cool developer who got interested for some time and made some very interesting contributions.

When I asked for the subdomain http://kraft.kde.org a long time ago, I got the reply that it is not in the interest of KDE to give every little project a subdomain. As a result I registered http://volle-kraft-voraus.de and have run it since then, happily showing a “Part of the KDE family” logo on it.

Besides the indirect contributions through the libraries that Kraft uses, I shipped Kraft with the translations made by the KDE i18n team, for which I was always very grateful. Otherwise I received no other services from KDE.

Why Github?

GitHub’s workflow serves me well in my day job, and since I have only little time for Kraft, I like to use the tools that I know best and that give me the most efficiency.

I know that GitHub is not free software, and I am sceptical about that. But GitHub also does not lock you in, as we are still on git. We all know the arguments that usually come to the table at this point, so I am not elaborating here. One thing I want to mention, though, is that since I moved to GitHub publicly, I have already received two little pull requests with code contributions. That is a lot compared to what came in during the last twelve years of living on KDE infrastructure only.

Summary

Kraft is a small project, driven by me alone. My development turnaround is good with GitHub, as I am used to it. Even if no KDE developer ever looked at GitHub (which I know is not true), I have to say with a heavy heart that Kraft would not take great harm from leaving KDE’s infra, based on the experience of the last twelve years.

If the KDE translation teams do not want to work with GitHub, I am fine with accepting that, and I wonder if there could be a solution other than switching to Transifex.

One point, however, I would like to make very clear: I did not wish to leave KDE, nor did I aim to move Kraft out.
I still have friends in the KDE community, I am still very interested in free software on the desktop and elsewhere, and my opinion is still that KDE is the best around.

If the KDE community feels that Kraft must not be a KDE project any longer because it is on GitHub, OK. I asked the KDE sysadmins to remove Kraft from the KDE git, and it is already done.

Kraft now lives on on GitHub.

Categories: FOSS, KDE, Kraft, Opinion, Qt Tags: , ,

SMB on openSUSE Conference

May 21, 2017 1 comment

The annual openSUSE Conference 2017 is coming up! Next weekend it will again take place in the Z-Bau in Nuremberg, Germany.

The conference program is impressive and if you can make it, you should consider stopping by.

Stefan Schäfer from the Invis server project and I will organize a workshop about openSUSE for Small and Medium Business (SMB).

SMB has long been a matter of the heart for the two of us: both Stefan, who even does it for a living, and I have used openSUSE in the SMB area for a long time, and we know how well it serves there. Stefan even initiated the Invis Server project, which is completely free software and builds on top of the openSUSE distributions. The Invis Server adds a whole bunch of extra functionality to openSUSE that is extremely useful in the special SMB use case. It has come a long way, starting as Stefan’s own project many years ago and evolving into a properly maintained openSUSE spin in OBS with a small but active community.

The interesting question is how openSUSE, the Invis Server and other smaller projects, like for example Kraft, can unite and offer a reliably maintained and comprehensive solution for this huge group of potential users, which is currently locked in to mainly proprietary technologies, while FOSS can really make a difference here.

In the workshop we will first introduce the existing projects briefly, maybe discuss some technical questions like the integration of new packages in the openSUSE distributions, and also touch on organizational questions like how we want to set up and market openSUSE SMB.

Participants in the workshop should not expect too much presentation. We rather hope for a lively discussion with many people bringing in their projects that might fit, their experiences and ideas. Don’t be shy 🙂


Raspberry based Private Cloud?

December 11, 2016 15 comments

Here is something that might already be a little outdated, but I hope it still adds some interesting thoughts. The rainy Sunday afternoon today finally gives me the opportunity to write this little blog post.

Recently an ownCloud fork came up with a shiny little box with one hard disk, which can be complemented with a Raspberry Pi and their software, promoted as your private cloud.

While I like the idea of building a private cloud for everybody (I started to work on ownCloud because of that idea back in the day), I do not think that this kind of gear is a good solution for a private cloud.

In fact, I believe that throwing this kind of implementation on the table is especially unfortunate, because if we come up with too many suboptimal proposals, we waste the willingness of users to try them. This idea should not target geeks, who might be willing to try ideas again and again. The idea of the private cloud needs to target every computer user who wants to store data safely but does not want to care about it for longer than necessary. And with those users I fear we have only a very small chance, if one at all, to introduce them to a private cloud solution before they go back to something that simply works.

Here are some points why I think solutions like the proposed one are not good enough:

Hardware

That is nothing new: the hardware of the Raspberry Pi was not designed for this kind of use case. It is simply too weak to drive ownCloud, which is a PHP app plus a database server that has some requirements on the server’s power. Even with PHP 7, which is faster, and the latest revisions of the mini computer, it might look OK in the beginning, but after all the necessary bells and whistles have been added to the installation and data has run in, it will turn out that the CPU power is simply not enough. Similar weaknesses also apply to the networking capabilities, for example.

A user who finds that out a couple of weeks after she started working with the system will remain angry and probably go (back) to solutions that we do not fancy.

One Disk Setup

The solution comes as a one-disk setup: how secure can data be that sits on one single hard disk? A seriously engineered solution should at least recommend a way to store the data more securely and/or back it up, for example on a NAS at home.
That can be done, but it requires manual work and might require more network capabilities and CPU power.

Advanced Networking

Last, but for me the most important point: having such a box in the private network requires drilling a hole in the firewall to allow port forwarding. I know that this is nothing unusual for experienced people, and in theory little problem.

But for people who are not so interested, it means they need to click a button in the interface of their router without understanding what it does, and maybe even enter data by following documentation that they have to trust. (That is not very different from downloading a script from somewhere and letting it do the changes, which I would not recommend either.)
Mistakes here can potentially have a huge impact on the network behind the router, without the person who made them even understanding it.

DynDNS is also needed: again not a big problem in theory and for geeks, but in practice it is not easily done.

With a good solution for a private cloud, it should not be necessary to ask for that kind of setup.

Where to go from here?

There should be better ways to solve these problems with ownCloud, and I am sure ownCloud is the right tool to solve them. I will share some thought experiments that we did some time back, to foster discussion on how we can use the Raspberry Pi with ownCloud (because it is a very attractive piece of hardware) and solve the problems.

This will be the subject of an upcoming blog post here, so please stay tuned.


Categories: FOSS, Opinion, ownCloud Tags: ,