Plone

published Nov 03, 2021

This is here to serve as contents for the atom/rss feed for Plone, also read by planet.plone.org.

Jeff Knupp: Keynote: Writing idiomatic Python

published May 09, 2014

Jeff Knupp gives the keynote at Pygrunn: Writing idiomatic Python.

See the PyGrunn website for more info about this one-day Python conference in Groningen, The Netherlands.

Towards comprehensible and maintainable code.

I am author of the Writing Idiomatic Python book.

Idiomatic Python is Pythonic code: code written in the way that the Python community has agreed it should be written.

Who decides this? Python developers, through the code that they write. Occasionally: Guido van Rossum.

The goals are: readability, maintainability and correctness.
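A classic illustration of the difference (the function names here are made up for the example):

```python
# Non-idiomatic: manual index bookkeeping, carried over from other languages.
def label_items_unidiomatic(items):
    result = []
    for i in range(len(items)):
        result.append(str(i) + ": " + items[i])
    return result

# Idiomatic: enumerate() and a list comprehension state the intent directly.
def label_items(items):
    return ["%d: %s" % (i, item) for i, item in enumerate(items)]
```

Both produce the same output; the second version is shorter, harder to get wrong, and immediately recognizable to other Python programmers.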

"But I am a scientist, why should I write idiomatic code?" You want to be peer reviewed, right? If your code is eye-bleeding bad, you will get peer ignored.

"But I just write scripts." You program in Python.

"But I am coming from Java. Python programmers can still read all my class names that all have Factory in it, right?" Wrong.

Cognitive burden is the increased mental effort required to keep track of what is going on. Don't make me think.

Knuth said that our main task is not instructing computers, but explaining to humans what a computer does.

Idiomatic code makes your intentions clear. It does not automatically make the code correct, but it lets others spot mistakes more easily.

Part of it is staying up to date with changes in the language. Is there a better way to do it, with a new construct from the standard library or an extra package?

You are the most frequent reader of code you write. Have mercy on your future self. He may be a violent psychopath coming back to slap you in the past.

Artur Barseghyan: Modern authentication in Python web applications

published May 09, 2014

Artur Barseghyan talks about Modern authentication in Python web applications, at PyGrunn.

See the PyGrunn website for more info about this one-day Python conference in Groningen, The Netherlands.

Part 1: Single Sign-on (SSO) using a Central Authentication Service (CAS).

Without SSO you need to log in manually to lots more websites or apps, or put everything in one website/app. Or you can create a custom API to share authentication information.

With SSO a user logs in once and gains access to all systems without being prompted to log in again.

(JaSig) CAS is an enterprise SSO solution. Open source, well documented. Web browser talks to application server, app server talks to CAS, web browser authenticates with CAS once, CAS gives the app a ticket showing that the web browser is authenticated.

See http://jasig.org/cas
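The validation step of that flow can be sketched server-side: the app calls the CAS 2.0 serviceValidate endpoint with the ticket it received and parses the XML answer. A minimal sketch, assuming a standard CAS 2.0 server (the URLs are illustrative):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

# The XML namespace used by CAS 2.0 responses.
CAS_NS = "{http://www.yale.edu/tp/cas}"

def service_validate_url(cas_base, service, ticket):
    # The app server calls this CAS endpoint to check a ticket it was given.
    query = urlencode({"service": service, "ticket": ticket})
    return cas_base.rstrip("/") + "/serviceValidate?" + query

def parse_cas_response(body):
    # Returns the authenticated username, or None if validation failed.
    root = ET.fromstring(body)
    success = root.find(CAS_NS + "authenticationSuccess")
    if success is None:
        return None
    user = success.find(CAS_NS + "user")
    return user.text if user is not None else None
```

In practice you would fetch `service_validate_url(...)` over HTTPS and feed the response body to `parse_cas_response`; frameworks like Django and Plone have plugins that do exactly this behind the scenes.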

Pro: CAS is modular and highly pluggable, so it works with lots of frameworks.

Contra: SSO availability and security become critical. You should subscribe to the CAS mailing list to keep on top of issues. Set up notifications for when your CAS server runs into problems.

Our use case:

  • User base in an Active Directory server
  • CAS server
  • Two apps/websites:
    • Dashboard app in Django
    • Document management system in Plone
    • More to come

CAS alternatives: JOSSO, OpenAM, Pubcookie, CoSign.

Part 2: Two-step verification

Or: two-factor authentication. Can be based on:

  • knowledge factor, something only you know
  • possession factor, something only you possess
  • inherence factor, something only you are, like fingerprints

Common solutions:

  • SMS authentication. You need to pay for each SMS, so it can become expensive. Phone might not be connected.
  • Google Authenticator (for mobile apps). Very easy to integrate in your app. Requires an extra app to be installed on your device.
  • Hardware token generators. Easy to use. But extra device to carry with you.
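The Google Authenticator option is based on standard TOTP (RFC 6238), which is simple enough to sketch with only the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    # RFC 4226: HMAC-SHA1 over the big-endian counter, then dynamic truncation.
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key, for_time=None, step=30, digits=6):
    # RFC 6238: HOTP with the counter derived from Unix time (30s windows).
    now = time.time() if for_time is None else for_time
    return hotp(key, int(now // step), digits)
```

The server and the phone app share the secret key; both compute the same 6-digit code from the current time, so no network connection is needed on the phone.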

Software we made for Plone:

Similar software is available for Django.

Alternatives:

  • risk-based authentication, based on behavioral biometrics, keystroke dynamics, etc
  • strong authentication
  • reliance authentication

Python Users Netherlands meeting 15 January 2014

published Jan 15, 2014

Summary of the meeting of the Dutch Python Users group on 15 January 2014.

We were welcomed at the offices of Schuberg Philis in Schiphol-Rijk for food, drink, presentations, lightning talks and meeting fellow-Pythonistas. Thanks for organizing!

Schuberg Philis is doing managed hosting, with high uptime.

Pini Reznik - Ansible, Docker and Django

Pini Reznik works for Ugly Duckling.

In 1995 you had thick clients and thick servers. Well-defined stack of O/S, runtime, middleware. A monolithic physical structure.

In 2014 you often have a thin app on mobile, tablet. Various web front-ends. You have multiple environments, for example several (parts of) applications and servers running on various versions of Python and different operating systems.

Compare this with lots of different cargo that may need to be moved with trucks, trains, planes, ships, or all of them at some point. Some seventy years ago they switched to containers for shipping goods. You put things inside containers in your factory. A crane only supports one container size: it is simplified. Much easier than fitting a few pianos and barrels of oil without a container.

Common challenges in the pipeline:

  • on development: set up your environment, especially when you first start for a company or a new project
  • testing: a clean environment, with good tests
  • acceptance: is it really similar/equal to production?

So: put your software in containers.

What is Docker? http://docker.io says: Docker is an open-source project to easily create lightweight, portable, self-sufficient containers from any application.

We have a Java project where every test is run in a separate Docker container. This is very quick.

Docker is not a VM (virtual machine). Docker runs directly on the host operating system. A VM has much higher overhead.

Orchestration: defining and deploying multiple environments. Ansible is an orchestration engine. Key concepts:

  • agentless: you only need your certificate or password and you can run on any machine. For most stuff you usually need Python on that machine though.
  • language agnostic: you just need to be able to execute code; the code itself does not need to be Python. It can be a shell script, C, anything that is supported by the target machine.
  • inventory: smart way to manage all the hosts. Inventory is in /etc/ansible/hosts: here you define that servers A and B are webservers, and servers C, D and E are database servers.
  • playbooks: scripts that define hosts, vars, tasks, handlers. What to execute and where to execute it.
  • modules: functionality written by somebody, for example for integration with docker.
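The inventory and playbook concepts above can be sketched as follows (hostnames and the tasks are placeholders; `apt` and `service` are standard Ansible modules):

```yaml
# Inventory (/etc/ansible/hosts, INI format):
#   [webservers]
#   a.example.org
#   b.example.org
#   [dbservers]
#   c.example.org

# Playbook (e.g. site.yml): which hosts, which tasks.
- hosts: webservers
  tasks:
    - name: ensure nginx is installed
      apt: name=nginx state=present
    - name: ensure nginx is running
      service: name=nginx state=started
```

Running `ansible-playbook site.yml` then applies those tasks to every host in the webservers group.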

Ad-hoc commands are for example:

ansible all -a "reboot"
ansible webservers -a "reboot"

There is a commercial UI around Ansible; that part is paid, the rest is free.

Docker + Ansible = software configuration management done right. Everything we need for building our software is now finally in version control.

Demo. If you have a Docker image called ubuntu, this takes just 122 milliseconds:

time docker run ubuntu ls

The docker module for Ansible is a bit out of date; Docker itself is still in early development.

Roderick Schaefer - Django as your Backbone

Roderick Schaefer is currently working for Schuberg Philis. See also https://wehandle.it. We do mission critical outsourcing. Say you have a bank and need a guaranteed 100 percent uptime. We can do that. We embrace devops. My devops team works on a project called Connect. We love Python and Django.

Old school web development: Django with MTV/MVC pattern. Model, template/view, controller.

New school: API driven. We use: Python, Django, jQuery, TastyPie (REST framework), Backbone, Require.js, Underscore, Handlebars, Backbone-TastyPie. It just makes sense to work API-driven. With APIs you can securely expose databases. The front-ends consume, process and present. Your average phone is faster than your average few-years-old server.

Introduction to TastyPie. This is a tool you hook up to Django to expose your Django models. It can also expose non-ORM data. It includes authorization, and serialization with JSON.

Backbone. Frontend MVC framework. Uses Underscore.js and friends. Template rendering, routing (note: try to do only routing here, do not corrupt it with other stuff), models, collections, events.

For PATCH support and file upload to work in Backbone plus TastyPie, you need a few changes in your code. There is form generation with Backbone-Forms.js.

A tip when transitioning to API-based development: wrap the JavaScript includes of your base.html in a block, and empty that block in your Backbone-powered apps. Include some legacy-scripts template from the template of your Backbone app, for compatibility with generic things like the main navigation.
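That tip, sketched in Django template syntax (block and file names are made up for the example):

```html
{# base.html: wrap the legacy script tags in an overridable block. #}
{% block legacy_scripts %}
  <script src="/static/js/main-navigation.js"></script>
  <script src="/static/js/old-widgets.js"></script>
{% endblock %}

{# backbone_app.html: override the block with nothing, so only the #}
{# Backbone app's own scripts drive the page. #}
{% extends "base.html" %}
{% block legacy_scripts %}{% endblock %}
```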

Challenges are url reversal and session state awareness.

Check http://TodoMVC.com to get familiar with various MVC frameworks, including Backbone.

Look into Backbone SEO if you want to use this for sites where you do not want a single page but want to be indexable by search engines. It is possible.

And now the lightning talks.

Pawel Lewicki - Plone

Working with gw20e in Amsterdam. Plone is a Python CMS. It has everything you want. It may be scary for most of you, but it does not have to be. Try out the installer, it is quite easy. I show a recorded demo. You have content types, workflow states, you can see changes to pages and revert them.

It is Python. Enterprise ready, scales well. Fully functional application. Lots of add-ons if you want those. Very simple from a user perspective, being able to edit in the same environment as visitors see. There is a user manual as a book.

See http://plone.org/

Bahadir Cambel - PythonHackers.com

See http://PythonHackers.com

I use lots of languages, but I like Python. But where to get information? IRC, GitHub, Bitbucket, SVN, python.org, blogs. Come to the PythonHackers site.

Learn and share with REPL (Read - Evaluate - Print - Loop). Discover, connect, contribute and be awesome.

Find hackers like you. Discover and discuss open source projects. Write, talk, share code. Build applications together. Read and write articles.

A single platform only for Python, with python hackers like you. A bit twitterish, with messages and following, channels, timeline.

What we use technically: Flask plus plugins, memcache, Redis, Postgres, Cassandra, Autobahn, Fabric, CoffeeScript. Future: Apache Kafka, some Clojure.

Programming is a personal expression.

Douwe van der Meij - Concepts in Django

Working at gw20e. A project may have a CRM and a webshop package, with some interconnection. You may want to open source the webshop, but how can you make it work without the CRM?

General concept:

  • Producers contain data.
  • Consumers need producers.

For this example we have the concept of a Customer. The CRM app produces a Customer. The webshop app consumes a Customer. Define a CustomerConcept class. Define a foreign key from the webshop to the CustomerConcept.

I created a concepts Django app. This makes apps reusable without dependencies. The only dependency is a concept. You may choose your own concept producer, different for various clients.
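The pattern can be illustrated in plain Python (this is a sketch of the idea, not the actual concepts Django app; all class names besides CustomerConcept are made up):

```python
class CustomerConcept:
    """The contract both apps agree on; holds no business logic."""
    def __init__(self, name, email):
        self.name = name
        self.email = email

class CrmCustomer:
    """Producer: the CRM owns the full customer data."""
    def __init__(self, name, email, internal_notes=""):
        self.name = name
        self.email = email
        self.internal_notes = internal_notes  # never leaves the CRM

    def as_concept(self):
        # Expose only the agreed-on concept, not CRM internals.
        return CustomerConcept(self.name, self.email)

class Webshop:
    """Consumer: depends only on the concept, not on the CRM."""
    def invoice_header(self, customer):
        return f"Invoice for {customer.name} <{customer.email}>"
```

The webshop can now be open sourced on its own: any producer that can hand it a CustomerConcept will do, whether that is the CRM or some other app.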

Related work: Zope Component Architecture.

See https://github.com/douwevandermeij/concepts_example

Ilja Heitlager - MicroPython

I am a Kickstarter fan. Information Officer at Schuberg Philis.

Damien George came up with the MicroPython project on Kickstarter.

PyMite was first, Python on a microcontroller.

MicroPython has a dedicated board. Python 3.3. Compiler/code generator on the board. Much faster. Library support, wifi board.

Participate on http://www.micropython.org

Expected: April 2014.

Holger Krekel - re-inventing Python packaging & testing

published May 10, 2013

Holger Krekel gives the keynote at Pygrunn, about re-inventing Python packaging and testing.

See the PyGrunn website for more info about this one-day Python conference in Groningen, The Netherlands.

I am @hpk42 on Twitter.

I started programming in 1984. I am going to tell you how distribution and installation worked in those days, as you are too young to know. A friend and I would sit down after school with a magazine. One of us would read hexadecimal numbers from it and the other typed them in. An hour and a half later we could play a Pac-Man game.

Apprentice: "Can anyone tell me why X isn't finished?"

Master: "It takes a long time to write software."

Projects take time. CPython is 22 years old.

Where does all this effort go? Into mathematical algorithms? No. Deployment takes a huge bite. Software needs to run on different machines, needs to be configured, tested, packaged, distributed, installed, executed, maintained, monitored, etcetera.

The problem with deployment is the real world. Machines are different, users are different, networks are different, operating systems are different, software versions are different.

There are producers of software. If, as a producer, I change an API or a UI, that creates a danger for my users. This makes releasing a new version risky, because for the users deploying the new version is potentially dangerous.

A lot can be solved by automation. Automated tests help. You need to communicate (allow users to track changes, have discussions). Configurations should be versioned so you can go back to earlier versions or at least see what the difference is. You need a packaging system and a deployment system. This may be more important than choosing which language to use.

The modern idea to simplify programming is usually: let's have only one way so it is clear for everyone what to do. Oh, and it should be my way.

Standardization fosters collaboration, even if the standard is not perfect. But tools that come out of this standardization are more important than the standardization document itself.

Are standardized platforms a win? For example C64/Amiga, iOS, Android, Debian, .NET, company-wide choices for virtual machines and packaging. This reduces complexity, but increases the lock-in. You may not want to bet your whole business on one platform.

Modernism: have one true order. For example, Principia Mathematica for having one system of mathematics that could do everything. Gödel proved this was impossible.

Let's check the koans of Perl and Python. Perl says there is more than one way to do it. Python says there should be one - and preferably only one - obvious way to do it. Both say there are multiple ways. You need to take that into account.

A note on the Python standard library: Python includes lots of functionality. This was a good idea in the past. Today, packages on PyPI often provide better APIs, and they can be improved independently of Python releases.

Perl has the CPAN, Comprehensive Perl Archive Network. Lots of good structure in there.

Python is still catching up. Python is growing declarative packaging metadata instead of keeping everything in the executable setup.py file. It is trying to standardize on pip and wheels, but easy_install remains a possibility. Uploading to and otherwise interacting with the PyPI server is hard today. The server is hard to deploy on a laptop. There are no enforced version semantics, and the protocol is brittle. It is hard to move away from setup.py though.

http://pypi-mirrors.org lists about eight mirrors of the official http://pypi.python.org server. Most are not up to date, or not updating at all. Not good.

Perl and Python are both not living up to their koans. Python has a lot to improve.

What needs to be improved? setuptools and distribute are being merged. The bandersnatch tool is being deployed, which is much better and faster for mirroring. Several PEPs are being discussed and considered. The people proposing these PEPs are talking to each other, so communication is good. New version comparison, new packaging metadata version, new rules on PyPI, etcetera. A lot is happening.

We should be aware of the standardization trap: you try to solve the five existing ways of doing something by adding a sixth way. To avoid this, don't demand that the world changes first before your tool or idea can be used. To a certain degree Python fell into that trap, but that is outside the scope for this talk.

I would like to focus on integration of meta tools. These can configure and invoke existing tools and make them work for most use cases. You can enable and facilitate new technology there.

Testing

Python has lots of testing tools, like nose, py.test, unittest, unittest2, zope.testing, make test, setup.py test.

tox is a "meta" test running tool. Its mission is to standardize testing in Python. It is a bit like a Makefile. It runs the tests with the tools of your choice. It acts as a front-end to CI servers. See http://tox.testrun.org for details.
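A minimal tox.ini shows the idea (the interpreter versions and test command are just an example):

```ini
[tox]
# Run the same test suite against Python 2.7 and 3.3.
envlist = py27,py33

[testenv]
# Dependencies installed into each isolated virtualenv.
deps = pytest
# The command every environment runs.
commands = py.test
```

Running `tox` then builds a virtualenv per listed interpreter, installs the package and its deps, and runs the test command in each.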

Travis CI is a "meta" test running service. It configures all kinds of dependencies, priming your environment.

devpi

I have a new idea, devpi: support packaging, testing and deployment. The devpi-server part is a new compatible Python index and upload service. The client part has subcommands for managing release and QA workflows.

Why a new index server? In the existing solutions I missed an automatically tested, extensible code base, among other things.

devpi-server is self-updating. It is a selective mirror. It does not try to update all packages on the original PyPI, just the ones that you actually use.

But: working with multiple indexes is burdensome, so devpi provides "workflow" subcommands: use sets the current PyPI index, upload builds and uploads packages from a checkout, and test downloads and tests a package. So you can create a package, upload it to a local test PyPI, test it, and then upload it to the real PyPI.

I did the last pytest releases using devpi.

Development plans: MIT licensed, test driven development. Get early adopters.

The main messages from this talk:

  • Evolve and build standards, do not impose them.
  • Integrate existing solutions, do not add yet another way, if possible.
  • Let's share this tooling and collaborate. Maybe you have some tool to reliably create a Debian package from a Python package. Make it available and get feedback and code from others.

Strive for something simpler, see the requests library. Simplicity is usually something that emerges by using a piece of software.

Luuk van der Velden - Best practices for the lone coder syndrome

published May 10, 2013

Luuk van der Velden talks about best practices for the lone coder syndrome, at PyGrunn.

See the PyGrunn website for more info about this one-day Python conference in Groningen, The Netherlands.

I do a PhD at the Center for Neuroscience, University of Amsterdam. I switched from Matlab to Python a few years ago. I am a passionate and critical programmer.

Programming is not a substantial part of most science curricula, apart from obvious fields like computer science. Experiments in the sciences generate more and more data, so the demands on computing power and data analysis keep growing.

A PhD student, whom we take as an example of a lone coder, is responsible for his or her own project and does the work alone: experiments, analysis. Collaborations do happen, but are asymmetric. I can talk to others, but they usually do not program together with me. Or they pass me some Matlab code that I then have to translate into Python.

A PhD will take about four years, so your code needs to keep running for all that time, maybe longer. Development is continuous.

Cutting corners when working on your own is attractive. You are the only one who uses it, and it works, so why bother improving it for corner cases? High standards demand discipline. So you end up with duplicated code, unreadable code, no documentation, unstructured functionality with no eye for reuse, and code rot.

There is a scripting pitfall. Scripting languages like Python are a flexible tool to link different data-producing systems, process data and create summaries and figures. Pitfalls for common scripts are: data hiding, hiding of complexity, mixing of functionality (housekeeping and processing), lack of scalability, and no handles for code reuse.

What a script for scientific analysis should do is define what you want, concisely.

Prototyping is essential for researching a solution. It is used continuously. Consolidation is very different from prototyping. Some things are better left as a prototype.

You should have a hard core of software that is tested well. In your scripts you use this, instead of copying an old full script. 'Soft' code sits between the hard core and the script, as an interface.

As a scientist you did not get educated as a programmer. So you should get educated, and as Python programmers we should educate scientists. Presently the emphasis is on getting work done, not on programming. Matlab is the default language; it was originally a stripped-down version for teaching students, but everyone kept using it. Closed source software goes against the scientific ethos.

Python offers a full featured scientific computing stack. Python scales with your skills. You can use imperative code, functional, object oriented or meta programming. Python is free, so you can use the latest version without needing to pay for an upgrade like with Matlab.

We can organize courses and workshops, for example Software Carpentry.