puppet-php
BC layer / deprecation warnings
In two of my PRs the discussion came up whether to add any BC layers or deprecation warnings. I would vote for deprecation warnings, semver, and a BC layer instead of simply increasing the major version most of the time.
- https://github.com/voxpupuli/puppet-php/pull/436
- https://github.com/voxpupuli/puppet-php/pull/435
Chrome and Firefox took a step in the wrong direction: every release increases the major version. This has led to relevant add-ons no longer working, workflows having to change, people unable to work for a day, and so on.
I come from Symfony, a PHP framework that follows best practices and is enterprise-ready. It has a roadmap with exact timeframes for when each version is retired and which versions are LTS releases: http://symfony.com/doc/current/contributing/community/releases.html#schedule
It is quite strange that many Puppet components and modules have no real roadmap, or only a quite short support cycle:
- https://puppet.com/misc/puppet-enterprise-lifecycle
- https://puppet.com/products/capabilities/platform-support-lifecycle
- https://ask.puppet.com/question/2654/what-are-the-major-eol-dates-for-puppetlabs-software/
- I have not found any roadmap for the community version
Often modules depend on very new versions of other modules, which sometimes even require a higher version of Puppet. I switched to Puppet because I want less work, not to invest a lot of time keeping everything up and running. Often a simple boolean flag like old_behavior, true by default, can save many hours of work. For the developer it takes only 15 minutes to add such a flag, but for all the users of the module it saves them from having to update the module and test it against their infrastructure.
Most of the time Puppet modules simply increase the major version. Maybe we should rethink this and add BC layers (well documented, so we can remove them afterwards). The way Symfony does it is really cool: for example, version 3.4 is nearly identical to 4.0, the two were released at the same time, and the only difference is that in 4.0 everything deprecated during the 3.x cycle has been removed.
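To make this concrete, here is a minimal sketch of what such a flag could look like in a module class. The class and parameter names are hypothetical, not the real puppet-php interface; deprecation() is Puppet's built-in function for logging a warning once per key:

```puppet
# Illustrative sketch only: the class name and the old_behavior parameter
# are hypothetical, not part of the real puppet-php interface.
class mymodule (
  Boolean $old_behavior = true, # keep the old default so existing users are unaffected
) {
  if $old_behavior {
    # Puppet's built-in deprecation() logs this message once per key
    deprecation('mymodule_old_behavior',
      'old_behavior is deprecated and will be removed in the next major release; set it to false to opt in to the new behavior')
    # ... resources implementing the old behavior ...
  } else {
    # ... resources implementing the new behavior ...
  }
}
```

Users who are ready for the new behavior flip the flag at their own pace, and the module only drops the old code path in the next major release.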
Hi,
- It's documented somewhere that the open source version of Puppet aligns its support lifecycle with the Puppet versions shipped in Puppet Enterprise. I know it's hard to find :(
- At the moment, all of the Vox Pupuli modules should support the oldest version still supported by Puppet Inc., which is 4.10, and 99% of our modules do. Anybody running an older version should really consider upgrading their environment, because it's heavily outdated.
- Since the introduction of AIO packages it should be easy for most people to upgrade from time to time.
- We take our versioning very seriously. We carefully review our pull requests to detect breaking changes, new features, and bugfixes, and based on that we decide on the next version number. We try to honor semantic versioning, and not every release is a breaking change. We don't backport fixes to older releases; that's just not possible for the few people we have to keep track of. The pull requests we get often mix features and bugfixes, so cherry-picking just the fixes is frequently impossible. That's nothing we, as a community working on this in our free time, can keep track of.
- As a Puppet module user in a legacy environment, I want a backwards compatibility layer. Implementing one in a working and tested way requires a lot of effort, and that's nothing we can do for our >100 modules:
- This would require a lot of tests for the old and the new behaviour (see the test sketch after this list)
- Who decides how long we support the old behavior?
- Who keeps track of that?
 
- A non-linear git history requires a lot of changes:
- We need multiple branches
- Our whole changelog concept currently works (as far as I know) only with a linear history
- We would need to teach contributors and collaborators how to use a new git workflow
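To illustrate the testing point from above: even for the hypothetical old_behavior flag sketched earlier, a minimal rspec-puppet spec already has to cover every assertion twice, once per code path, and that multiplies with every supported OS:

```ruby
# spec/classes/mymodule_spec.rb: illustrative only; the class and
# parameter are the hypothetical ones from the sketch above.
require 'spec_helper'

describe 'mymodule' do
  context 'with old_behavior => true (the default)' do
    let(:params) { { old_behavior: true } }

    it { is_expected.to compile.with_all_deps }
    # ...plus assertions for every resource of the old code path,
    # repeated for every supported OS in the nodeset
  end

  context 'with old_behavior => false' do
    let(:params) { { old_behavior: false } }

    it { is_expected.to compile.with_all_deps }
    # ...plus the same set of assertions for the new code path
  end
end
```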
 
Something that many people don't understand is that they don't need to apply each new version of a module to their infrastructure. At conferences I have repeatedly gotten the feedback that somebody wanted to update one of our modules to the latest version just because it exists. So yes, if you update one part of your infrastructure, you may need to update other components as well, and we try to make those updates as smooth as possible. But those updates aren't always necessary. I also think people should invest more time in automating and testing their own infrastructure. It's not the fault of Vox Pupuli that people break their production environments from time to time just because they didn't test a new module properly before using it.
This exploded into a wall of text. Any feedback is highly appreciated. Please keep in mind that this is my personal opinion and not that of the whole Vox Pupuli collective.
I'm generally in favor of what you describe, but let's be blunt: we don't have the manpower to do that. A lot of the work we put into the current modules is to keep them running and working with recent versions. I do maintain multiple branches for the Foreman modules, and even for a handful it's a lot of work. For those modules I do it because it's part of my job and it's needed. However, Vox Pupuli is in the hobby category, and I can't imagine doing it for the number of modules we have, especially since I don't know or use most of the software.
Various modules have more dedicated maintainers. If someone steps up to take care of a module, they're more than welcome to raise the support level of the module to the one you describe. From a tooling perspective I'll gladly help since I already apply a similar model elsewhere and we can probably share tooling and best practices.
> Our whole changelog concept currently works (as far as I know) only with a linear history
This was addressed by @hunner, and the current master of GCG should work. It does require that we create PRs for the stable branches, but that's a good idea in general since it provides visibility and testing.
That is another good point. We know Puppet, test frameworks, git, and all the related tooling. But we don't have domain-specific knowledge for all the tools our modules automate. That makes reviewing most of the PRs pretty hard.
> Often a simple boolean flag like old_behavior, true by default, can save many hours of work. For the developer it takes only 15 minutes to add such a flag
Preserving backwards compatibility is almost never this simple in any software project, even more so when a codebase like this one hasn't been well maintained and needs a large refactor just to make future minor changes less difficult than they are at the moment. Avoiding breaking changes at all costs is also a good way to discourage contributors. PHP and Symfony are not great examples of projects with lots of contributors and a fast rate of development.
> Something that many people don't understand is that they don't need to apply each new version of a module to their infrastructure
Yes
> But we don't have domain-specific knowledge for all the tools our modules automate.
Also this.
Well, Symfony has over 1600 contributors, a steep learning curve, and high requirements; I don't call that bad. The development rate is also fast and stable: you have an exact roadmap for when each new version is released, and you have LTS releases.
But if you don't want an LTS codebase, where a user of this module can simply rely on it and doesn't have to upgrade everything all the time, then simply close my PR.
PHP 5.6 is EOL soon, but that doesn't mean people aren't using it anymore. That is reality.