Proposal: Phase II for Open Source Ecology's online app
  • Hi Guys,

    My name is Boris, and I'm a front-end web dev. I had an idea very similar to OpenSourceEcology, and while researching existing solutions I found your incredible project. I think you have a great idea going here, but there is still a lot of untapped potential. As you're aware, open-sourcing technology has the potential to completely route around corporate manufacturers and put the means of production directly in the hands of consumers. As you're also aware, some major impediments are the lack of technical know-how on the part of the audiences who would benefit most (people in third-world countries, low-income households), as well as the lack of a truly large-scale, organized community. I'd like to work on tackling the second issue.

    The fragmented nature of similar projects certainly doesn’t help either. Ventures like Thingiverse and Open Innovation Projects seem to offer similar features to your site (though with a focus on less significant products), making finding particular schematics a more difficult problem than it should be.

    Getting to the point – what if there was a github.com-like collection of schematics on the web? Coming from a web development background, I cannot overstate the impact that github has had on the openness and collaborative culture of webdev worldwide. Specifically, I think it would be immensely valuable to mirror the (1) versioning, (2) community, and (3) ease-of-use aspects of github. Instead of a single monolithic Tractor design, the community would constantly iterate and improve on as many designs as people need. Projects could be "pulled" and "forked" in a similar fashion to git, so that through community contributions, the quality of open source schematics would quickly rival and supersede that of their commercial equivalents. Transparent versioning would make it clear exactly what was changed, by whom, and when; everyone iterating on the same projects (rather than starting their own every time) would help build a stronger community than your current system has the capacity to support; a ubiquitous, centralized community would draw in corporate support (à la Twitter Bootstrap or Yahoo! YUI); and finally, a more formalized repository system for schematics would allow the system to scale indefinitely.
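    To make the "pull"/"fork" idea concrete, one community iteration on a design might look like this in git terms (the repository URL, branch name, and file names here are hypothetical):

```shell
# Grab the community's tractor design (hypothetical repository)
git clone https://example.org/ose/tractor.git
cd tractor

# Start an experimental branch for an improvement
git checkout -b stronger-axle
# ... edit the CAD/schematic files ...
git commit -am "Reinforce rear axle mounting plate"

# Publish the branch and propose it back for community review
git push origin stronger-axle
```

    Transparent versioning then comes for free: `git log` shows exactly what changed, by whom, and when.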

    I would love to be a part of your project, whether to bounce ideas or contribute to your site’s code. I believe immensely in what you guys are doing, and out of the many sites offering open source schematics, I think you have the most potential. As a developer I specialize in high performance code, so if you’re interested I could do a performance audit of your site and speed everything up, so that it feels more lightweight and easier to use (especially in older browsers, which are the rule rather than the exception in low income neighborhoods and third world countries). I can also help architect and implement a git system, and help out with any UI/UX work involved. With a large, centralized community, OpenSourceEcology would then be able to more efficiently tackle the education problem, and truly give every household the capacity to manufacture what they need.

    What do you think?
     
  • My 2c:
    I generally agree with you. I find the collection of different applications used a bit dizzying, and the poor discoverability across them makes it hard to find something again after reading it. And of course, versioning is handled (or not) in many different ways across these tools.

    My only sticking point with what you are describing is the "centralized" aspect: while it is crucial that all of the information be discoverable in one place, it's just as important to be able to host your own infrastructure. That's part of what github did really well: you can get your data out at any point, go run with it on a disconnected computer, and come back to the "mothership" when you need to.

    It seems the simplest way to get a DVD's worth of information out to a rural area will be, at least for the next while... a DVD. Or some other physical media. What can we do to take the core ideas of git (or another DVCS) and make them better suited to the kind of work that OSE is doing? At some point, the infrastructure for disseminating actionable information about OSE will become as important a tool as any of the physical implements.

    I think one of the best candidates for this effort is the Allura Project, which was recently submitted to the Apache Incubator. It is the system used for new projects on SourceForge.net, but can also run fairly well for a small team on a VM. The closest other open source project to it is probably redmine, but Allura is built to scale much more broadly. Allura provides a large number of features (http://sf.net/p/allura/wiki/Features/), but critically, per-project:
    • wiki (like MediaWiki, but probably not as good)
    • forums (like Vanilla, but probably not as good)
    • repositories (like github, but probably not as good)... though it does support merge requests and forks, etc.
    • tickets (like trello, but probably not as good)
    • search
    And can be extended in different ways (new wiki markup, new whole project tools, etc).

    I am involved in a project aiming to improve Allura's usefulness for building complex cyber-mechanical systems, with a focus on modular, model-based engineering for non-software folk: this work is open source, but we have not yet been able to distribute it... it should be within the next month, though. The fully distributed piece isn't there yet, but we've got the foundation.

    Looking forward to continuing the discussion!
     
    I agree completely about the decentralized aspect of git, although I glossed over it in my original post. As for Allura, it looks like a great project, but there are hundreds of other more lightweight options. I think that performance and scalability should be our main concerns here, as the former drastically improves the user experience in areas with dialup, and the latter allows the system to grow. My question would be whether git supports the core functionality we're after, or whether we need to look to less supported packages (e.g. Allura). I think it makes sense to stick to tried and true packages where possible. Also, there are dozens of existing GUIs for git (for less tech-savvy users). As for differences between the standard git distro and what we need:

    - The repository should be curated to some degree to protect the integrity of schematics. Main branches & pull requests address this issue, but perhaps we should also integrate the CAD equivalent of unit and integration tests.
    - It should be extremely easy to verify the authenticity and source of schematics. I'm not a security expert, but a simple MD5 hash might fit the bill.
    - The interface should be geared towards housewives and farmers - as non-technical of an audience as possible. Even if the user doesn't understand how to read the schematics they are downloading, they should be able to easily download them and get them to someone who does.
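    On the verification point, here is a minimal sketch with standard tools (the file names are made up; note that SHA-256 is a safer choice than MD5 these days, and a gpg signature covers the "source" half of the problem):

```shell
# Publisher side: record checksums for the released schematics, then sign the list
sha256sum ceb-press.dxf tractor-frame.dxf > SHA256SUMS
gpg --detach-sign SHA256SUMS          # requires a key pair; produces SHA256SUMS.sig

# Downloader side: check the signature, then the file integrity
gpg --verify SHA256SUMS.sig SHA256SUMS
sha256sum -c SHA256SUMS
```

    Any tampering or corruption of the files makes the `sha256sum -c` check fail loudly.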

    That's an interesting point about DVDs; I didn't really consider that a large part of distribution would need to happen offline. Whatever ends up being used – DVDs, USB sticks, external HDDs – space is limited. And with dialup, bandwidth is limited. So perhaps a core part of the software should be automatically compressing/decompressing with 7z or some other high-compression algorithm on I/O. That way compression doesn't become an extra step, but a core function that allows distribution over high-latency, low-bandwidth, low-space means.
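    As a sketch of that idea, the pack/unpack steps could simply wrap an LZMA-based archiver (xz here, which gives ratios comparable to 7z and ships on most Linux systems), so the user never runs compression by hand:

```shell
# Export: pack a schematics directory for DVD/USB distribution
tar -cJf schematics.tar.xz schematics/    # -J = xz (LZMA2) compression

# Import: unpack on the receiving machine
tar -xJf schematics.tar.xz
```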

    What do you think?
     
    git will only get you so far, and is, by itself, really best suited to Remembering Stuff You Really Care About: the storage of actual fab instructions, etc. There are great git-based wikis (github's gollum, for example), access control systems (gitolite) and at least a few tasking systems... but each of these comes with a stack of dependencies on top of it, and each will degrade performance from the theoretical maximum of what git could do in exchange for user interfaces, etc.

    in terms of off-line storage and trustable distribution, git already has that nailed... after a "git gc" git approaches the theoretical maximum of compression possible while maintaining full history. "git bundle" is a smart differential binary packaging system, so you could easily do a baseline distribution, and then introduce incremental changes without sending the whole thing again. combining this with "git tag" and gpg, you can verifiably trust that a tagged version was in fact created by the person who claims to have done it, as long as you have their public key... a whole separate problem :)
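    A sketch of that baseline-plus-increments flow (the branch and tag names are illustrative):

```shell
# Baseline: pack the main branch, full history, onto physical media
git bundle create baseline.bundle main
git tag -s v1.0 main -m "baseline release"   # signed tag; needs a GPG key

# Later: pack only the commits made since the baseline
git bundle create update.bundle v1.0..main

# Offline recipient: clone from the baseline, then apply the increment
git clone -b main baseline.bundle project
cd project && git pull ../update.bundle main
```

    The update bundle applies cleanly because its prerequisite commits are already present in the baseline clone.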

    my allura and git fanboyism aside, the lightest-weight distributed project management system would be fossil-scm.org. It's about as efficient as one can really get, and will probably run on everything on the planet... The Linux binary is 800kb, which includes a web server. It does not use git, instead providing its own metaphors and commands: an incredibly well-thought-out system by the guy who made SQLite (its only dependency). It provides version control, wiki and tickets.

    my thinking is, however, that the current stage of OSE development is really the concurrent development of many modular technologies, which have circular dependencies, and several of which don't exist yet. To pull that off, a searchable, multi-project system is needed where delegation of authority is transparent, and linkages between technologies are very explicit. For this, I think allura is the most compelling option... but it would need to grow significantly to reach a distributed model.

    further, when you start talking about cad unit tests: well, opencascade, the only feasible open source cad kernel (freecad uses it), is not exactly lightweight. And cad assets themselves, especially when versioned, are basically incompatible with space-efficiency... I don't have any relevant metrics handy (just planes and stuff), but just loading up a drivetrain model can peg a beefy desktop computer for a couple of minutes.

    so I think we are both talking about a few different modalities here:
    - the equivalent of today's *.opensourceecology.org, but more closely integrated, hosted by an ISP or whatever. most collaboration, etc in the near term would continue to be here
    - an internet-connected instance that can take advantage of economies of scale to do heavy computational lifting. if done right, this would be a great way for nerds like us to contribute: download this VM, and it will just start replicating the design repos, and executing simulations when contributors want to try out new things. this would be able to run on Linux, Mac, windows, whatever
    - a village-enabling instance that would be able to handle day to day tasks... simple, rugged, low power, etc.
    - a mobile instance which would be able to endure transit to remote locations to allow updating of a village instance... the power cube of data

    each of these would have different hardware and software needs, for sure, but I stand by a recommendation of allura for the hosted internet piece. still thinking about the other ones....
     
  • > in terms of off-line storage and trustable distribution, git already has that nailed... after a "git gc" git approaches the theoretical maximum of compression possible while maintaining full history. "git bundle" is a smart differential binary packaging system, so you could easily do a baseline distribution, and then introduce incremental changes without sending the whole thing again.

    This sounds like a great option for the repo component of the system. Stepping back a bit, SVN could be another good option: it's ubiquitous (used by the biggest tech companies), lightweight (working copies only contain the latest version, unlike git's full history), secure, and it scales extremely well.

    > my allura and git fanboyism aside, the lightest weight distributed project management system would be fossil-scm.org. it's about as efficient as one can really get, and will probably run on everything on the planet... The Linux binary is 800kb, which includes a web server. it does not use git, instead providing its own metaphors and commands, and an incredibly well-thought out system by the guy that made SQLite (its only dependency). it provides version control, wiki and tickets.

    Thanks for enlightening me, fossil scm looks awesome! It feels extremely lightweight and responsive, and may be a good candidate for benchmarks once we settle down on a few final options. The front end code isn't ideal, but is pretty good compared to existing solutions - with a bit of optimization I think we could speed it up quite a bit more.

    > my thinking is, however, that the current stage of OSE development is really the concurrent development of many modular technologies, which have circular dependencies, and several of which don't exist. to pull that off, a searchable, multi-project system is needed where delegation of authority is transparent, and linkages between technologies are very explicit. for this, I think allura is the most compelling option... but it would need to grow significantly to reach a distributed model.

    It looks like fossil supports multiple repos, but it seems that if we make a repo for each schematic, each would come with a bit of overhead (fossil uses 3 separate databases per repo). Is there a natural way to split up OSE's projects into a few repos, or do you think they should all be kept in a single, central database? Fossil also supports user privileges, and a wiki system would make linkages explicit. How does allura handle linking branches? Is it some bit of meta info built into the repo system itself?

    Again, Allura looks like a great start, but I would be very cautious about relying on it for such a large infrastructure. I would be much more comfortable with something that's been around for a while and is proven to scale – maybe even a composite solution with SVN + mediawiki + (some issue tracker).

    > further, when you start talking about cad unit tests: well, opencascade, the only feasible open source cad kernel (freecad uses it) is not exactly a lightweight. further, cad assets themselves, especially when versioned, are basically incompatible with space-efficiency... I don't have any relevant metrics handy (just planes and stuff) but just loading up a drivetrain model can peg a beefy desktop computer for a couple minutes.

    I wonder if it would be possible to host something on Amazon EC2 or something similar to offer a cloud-based testing platform. I imagine the costs would grow as the project grows, so it would be important to scale funding in step with increased adoption. But I'm probably getting ahead of myself here, and this would just be an extra layer of complexity on top of an already complex system.

    > so I think we are both talking about a few different modalities here:
    > - the equivalent of today's *.opensourceecology.org, but more closely integrated, hosted by an ISP or whatever. most collaboration, etc in the near term would continue to be here

    Yes, the github.com-like component of the project.

    > - an internet-connected instance that can take advantage of economies of scale to do heavy computational lifting. if done right, this would be a great way for nerds like us to contribute: download this VM, and it will just start replicating the design repos, and executing simulations when contributors want to try out new things. this would be able to run on Linux, Mac, windows, whatever

    This is a fascinating idea, but is far beyond my technical expertise. What kind of technologies are we talking about? Please correct me if I'm wrong, but we'd need a central server for repos, an app running on that server that pushes any changes through a particular socket, an app that runs on users' machines that polls that particular socket on the central server for changes, the ability to execute and publish simulations in the background... It may be because it's beyond my knowledge, but it seems to me that such a system would be by far the most complicated part of this proposal.

    > - a village-enabling instance that would be able to handle day to day tasks... simple, rugged, low power, etc.

    Would we need the ability to run simulations and do any kind of heavy lifting? It seems to me that a git-like backend along with a ticket system could be sufficient here.

    > - a mobile instance which would be able to endure transit to remote locations to allow updating of a village instance... the power cube of data

    This seems very similar to the village-enabling instance, unless I've misunderstood something.
     
  • This is the discussion I just posted about!

    I think that git isn't a great tool for this because of its reliance on manipulating plain-text documents: it wouldn't know how to correctly manipulate schematics and other less text-based things, and it provides no support for internal dependencies. It could be a great back-end for overall versioning, but probably not for the core part of the system.

    My thought on how this would work is essentially the creation of a new hybrid of a wiki with a version control system that can include different types of data and allow for dependencies on other pieces of data. Additionally, some of these data components would need to be active, like calendars and schedules, or things that are connected to machinery, etc. And it would need to be run on site to provide everything each site needed (each site would have their own full copy, like git does), so that each site could push and pull to each other when internet is available (or through sneakernet). The global version would be the starter kit, the version you copy, the origin remote, and you could browse and search the others. [Because people in Alaska don't care about how to build desert buildings, or the recorded performance of those buildings; only the people in another desert will care, and those people can go to that repository and copy that information as a remote so they can always access it, but it won't be part of their knowledge base.] And like a wiki, everyone can provide information on things, but that information will have a little more format. [Some people will need to post logs of the work, and that would be a different kind of attached data with a different way of being displayed than the opinion of how a food tasted, or than a microchip design.]

    Edit: Also, the thought of making this a distributed system is interesting, obviously not every computer in a village will be used all the time, grouping them together for simulations is a great way of best utilizing resources.

    But the hard part would be building all the different parts that would need to come together for all this to work. I am already working on something like this for my master's anyway. Sometimes I wish everyone was a programmer.
     
  • @MasonB-

    I would be cautious about building a tool from the ground up for this. There are thousands of existing solutions that have been tested and iterated on by the community; I would be surprised if there was nothing that works for OSE. With a github.com-like setup, readme.md's could contain the equivalent of wiki content, and a dependencies.json file could contain a computer-readable list of dependencies. This way the content remains decentralized, but can be easily aggregated in a central location using a single API (git), as opposed to multiple tools and protocols (or an entirely new proprietary tool). Any new, complex frontend system will be riddled with bugs, and will need extensive development and testing to get to a usable stage; this is not the case with established tools like git or SVN.
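    For illustration, such a dependencies.json might look like this (the field names and project names are invented for this sketch, not an existing standard):

```json
{
  "name": "ceb-press",
  "version": "2.1.0",
  "dependencies": {
    "power-cube": ">=1.4",
    "universal-rotor": "1.x"
  },
  "docs": "readme.md"
}
```

    A central aggregator could then crawl repos over plain git and build a searchable dependency graph without any custom server software.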

    Sorry if this sounds overly critical, it's an interesting idea but at this point does not sound very well thought through.

    Edit: Quoting from @Beluga's second article (thanks for the awesome links by the way):

    > It turns out that this is all actually much easier than we’ve been letting on. GitHub now automatically supports visual diffs for image files included in commits. What this means is that if you (as a designer) generate and include an up-to-date PNG version of your schematic and layout in every version and commit there, you will also automatically generate a matching visual-diff history of the project. This is potentially huge — if the open source hardware community will step up to the plate and take advantage of it.
     
  • Hey, folks, I was off traveling and didn't get to reply to this earlier.


       ... Stepping back a bit, SVN could be another good option...
        
    I am going to go ahead and say that SVN is not a good fit for the problem of interest here. Pretty much everything good that I said about git is not true about subversion.

        ... with a bit of optimization I think we could speed it up quite a bit more....

    Optimizing fossil any further would be really hard... to what parts exactly are you referring? There are some more skins and stuff out there, but I imagine the fossil source itself would be a better way to learn Really Good C than anything else.

        ... or do you think they should all be kept in a single, central database ...
        
    This is a long discussion, to be sure! Here's my basic idea, working with the Allura metaphor:
        - There are at least 3 "Neighborhoods" of http://opensourceecology.org
            - /gvcs (for tools)
            - /village (for replicators)
            - /user (for contributors)
        - The GVCS tools have one "Project" per Tool: (ex: /gvcs/ceb-press)
            - a Project has multiple Tools: repos, tickets, wiki, etc
                - /gvcs/ceb-press/cad <-- this is a repo
                - /gvcs/ceb-press/wiki <-- this is a tool
            - a Project can contain a structured data link to other projects, which are extracted when a project is extracted
                - systems like this are already mature and in use for doing amazing things, like Android:
                    - https://github.com/abstrakraft/rug <-- i have done some work on this
                    - http://ant.apache.org/ivy/
        - A Village has a page in the /village neighborhood, as more of a documentation mechanism than anything else
            - for example, whatever version of /village/factor-e-farm is pointing at could be considered the "last known good" version

        ... more comfortable with something that's been a round for a while, and is proven to scale...

    See SourceForge.net, which runs Allura. I don't think the scale argument is appropriate. It will also scale down and run on a single VM (see the Vagrant instructions).

        ... visual diffs for image files included in commits ...
        
    The image diffs are fantastic, but can be challenging to maintain consistently over time: camera angles, etc. Diff'ing of STL/STEP is the next critical step, as well as useful diff of other systems models to show change over time.

        ... host something on Amazon EC2 ...

    That is certainly an option, but keeping the needs of the system in something that *can* be owned by the community is very important.

        ... an internet-connected instance ...

    This is a large, but not impossible undertaking. The closest analogy to this would be SETI@home or Folding@home. It's even possible that such a framework as those could be used, but the OSE "problem" would require much more diverse, and less optimized, simulations, hence the generalized "computer". The biggest challenges here, actually are:
        - security:
            - let's not build an evil botnet
            - allow the hosting user to set bandwidth/processing/etc limits
            - perhaps a distributed VPN (several exist)
            - simulation results would likely need to be signed, tied to users, etc.
        - updating
            - we'll need more stuff in the future. chef or puppet would probably fit the bill, though something like Fedora Spin would be good here.
        - incentive
            - why would a user donate their processing time/bandwidth?

    So i am basically thinking:
        - VirtualBox
            - a Linux distro: Ubuntu would seem like the no-brainer, what with the Shuttleworth thing
                - Simulation engines:
                    - CAD engine (OCC/BRL-CAD),
                    - CFD (OpenFoam),
                    - thermal (HOMER),
                    - electronic (Ocarina),
                    - hydraulic (Modelica)
                - a configuration management system:
                    - Puppet
                    - Chef
                    - ... or just use the yum/apt system, depending on tightness of coupling
                - data transfer:
                    - notification: XMPP (ejabberd)
                    - design data transfer: git/hg/svn/fossil over HTTP(S)
                    - ... or something like BitTorrent...
                - signing: gpg: not a lot of options here, and the key system still needs help... similar to what these guys need: http://freedomboxfoundation.org/
                
    That's the guts of it, in addition to the server-side systems... but really, that's probably just hooks on repositories. You download the VM, generate a new GPG key with a little on-screen menu, and add that key up on the website. Next time a user wants a simulation, there is an "auction" process on XMPP to determine who should do it, and who should validate those results. The VM downloads the necessary assets, follows the modeling instructions, and pushes the results back into another repo, with the results tagged.

        ... seems very similar to the village-enabling instance ...

    Similar: you're looking for something that routinely can survive riding in the back of a truck, and still provide reliable data transfer. However, it doesn't have to have much, if any, computing capacity, whereas a village will need, at least, the ability to do training videos and run CNC tools, even if it can't do 3d rendering very well.


    Glad to see more folks pitching in insight!

        ... wouldn't know how to correctly manipulate schematics and other less text based things  ...

    Git's actually really quite *efficient* at doing binary stuff, even if it is not *smart* at doing diffs of arbitrary binaries... unless you tell it to be smarter with http://git-scm.com/book/ch7-2.html. It breaks up the file and only stores the "chunks" once... great as long as your data isn't encrypted!
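    For reference, the "tell it to be smarter" link is about git attributes; a custom textconv driver can make `git diff` show something meaningful for binaries. Here the `exif` driver name is arbitrary, and `exiftool` must actually be installed for the diff itself to work:

```shell
# Route *.png files through a metadata extractor when diffing
echo '*.png diff=exif' >> .gitattributes
git config diff.exif.textconv exiftool

# "git diff" on a PNG now shows changed metadata instead of "Binary files differ"
```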

    For the time being, there are not a lot of options for generalized binary diffing. The most compelling I have seen are things like upverter, which uses a variant of EtherPad/Google Wave, which is what drives Google Docs, for example. Another interesting one is polyglot, which actually uses a macro language (autohotkey) to translate between different CAD systems... so it figures out what *mouse clicks* to do to recreate an action.

        ... provides no support for internal dependencies ...

    Git actually has several ways to do this (submodules and subtree merges, for example). They're really pretty powerful. We went a different direction with https://github.com/abstrakraft/rug (mentioned above)

        ... hard part would be building all the different parts ...

    That's what this is all about. Right now, a lot of the developer efforts of funded OSE work is on the CAD tools themselves, such as FreeCAD. However, I can foresee a time when the needs we are discussing here become very important indeed.
     
  • Regarding massively-distributed simulations on general-purpose Linux backends: just saw this on the ol' hacker news.
    https://gridspot.com/compute/

    They offer a "worthy pursuits" grant deal, for which I would see OSE qualifying:
    https://gridspot.com/grants

    Further, they do some pretty cool energy stuff:
    https://gridspot.com/gridspot_safe
    most notably:
    "Thus, we decided to only run computations when the outdoor temperature near the user is below a certain level (currently 16 degrees Celcius). When it's that cold outside, we assume that the computer's room is being heated anyway. All of the electricity used to do computations gets turned into heat, according to the laws of physics. So the heat generated by the computations displaces the need for heat generated by a heater, eliminating or minimizing the net elecricity usage."
    Neat! Though I don't really buy it (an outside temp of 16C wouldn't preclude the need for cooling of more than a few servers), but at least they are thinking about it.

    Not that I am suggesting that this be THE platform, but once something like this exists, it becomes more likely that someone would figure out a way to cobble an equivalent together out of open source components and make it generally available... the most important thing they have here is the two-sided market model, which here is based on money, but could be based on reciprocal processing (I'll run your model if you run mine later) or something more appropriate for a community of practice, rather than a marketplace.

     
  • @bollwyvl-

    > Optimizing of fossil any further would be really hard... to what parts exactly are you referring?

    I'm mostly a front-end guy, so what sticks out to me are the client-facing components (HTML, CSS, JS). If those are any indicator of the back end code quality, I am not so sure about using it. However it sounds like this isn't the case, and the backend is well optimized?

    > See SourceForge.net, which runs Allura. I don't think the scale argument is appropriate. It will also scale down and run on a single VM (see the Vagrant instructions).

    I didn't know that's what they run, thanks for the tip. My concern would still be about the heavy weight and complexity of such a system: repo, issue tracker, wiki, internal communication, etc. could all be handled by a repo system with a GUI, albeit less robustly. This isn't my area of expertise, but what's wrong with using something like readme.md's for "wikis" and issues.json for issues, with everything in a simple git repo? That way changes and users are all handled by git (and not separate systems for the repo, the wiki, and issues), and all that's missing is a front end plugin for making issue tracking easier. The rest can be handled by a git GUI, which allows for easy access via command line or web browser depending on the user's needs.

    And just to clarify, in your proposal the repo system would be usable via any machine and not tied to the particular VM, correct?

    > That is certainly an option, but keeping the needs of the system in something that *can* be owned by the community is very important.

    Agreed, I'd definitely favor a decentralized system.

    > This is a large, but not impossible undertaking. The closest analogy to this would be SETI@home or Folding@home.

    This is a fascinating idea, but a bit over my head. Let me know if there's anything I can contribute to the front end of it :p


    This looks very cool. Can OSE channel some donation funds to this, or do we need to look elsewhere?

     
  • @bcherny

    > This looks very cool. Can OSE channel some donation funds to this, or do we need to look elsewhere?

    From the look of it they would probably provide the OSE project with the computation for free.

    @bollwyvl

    >[Comments on git]

    Ahh that's right, I had forgotten about git's hooks and attributes. And you make good arguments on the internal dependencies, I hadn't thought about making the repos that small...

    I always have this problem of wanting to reinvent the wheel... and the materials that make it up while I am at it...

    > This is a long discussion, to be sure! Here's my basic idea, working with the Allura metaphor: [Stuff]

    I would argue that while the layout you suggest is perfect for quite a while, I still think the idea of having a repo for each village, with its own set of tools and other information, is the better endgame. Even if they all just point straight back to the factor-e-farm remote, a replicator who makes changes can log them separately from the "official" version. Essentially, a village is forking (in github parlance) the entire set of information, and may internally want to maintain its own branches.

    [This of course raises an interesting problem: how do you look up the electronic documentation on a physical thing? Put QR codes on everything that contain the hash signature of the commit used to make the tool? Tag the commit with a unique id (or generate a string of dictionary words from the hash signature of the commit) that is engraved on the tool?]

    Anyway, my argument is that villages should be the largest unit of organization, so each village should manage its users separately.
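    The dictionary-words idea is easy to prototype. A rough sketch, assuming a toy 16-word list (a real deployment would want a much larger list, e.g. a diceware-style one, since three words drawn from 16 encode only 12 bits and would collide quickly):

```python
import hashlib

# Tiny illustrative wordlist; each word here encodes only 4 bits,
# so this is for demonstration, not for real uniqueness guarantees.
WORDS = ["acid", "barn", "cart", "dune", "echo", "fern", "gale", "hive",
         "iris", "jade", "kiln", "lark", "mesa", "nova", "opal", "pine"]

def commit_to_tag(commit_sha, n_words=3):
    """Derive a short, engravable word tag from a git commit SHA."""
    digest = hashlib.sha256(commit_sha.encode()).digest()
    return "-".join(WORDS[b % len(WORDS)] for b in digest[:n_words])
```

    The same string could be printed next to a QR code that carries the full commit hash, so the engraving is human-readable and the QR code is machine-precise.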
     
  • @MasonB-


    > I still think the idea of having a repo for each village with its own set of tools and other information is the better endgame.

    I may be misunderstanding you, but doesn't that defeat the purpose of git? The point is to crowdsource designs in order to improve upon existing content (by iteration or branching), not to fragment the already fragmented community. Villages should definitely be able to fork off and optimize designs for their specific needs, but isn't there value in encouraging them to upload their designs to a central repo, so another village can then use the modified designs?
     
    My point is that we aren't embracing the git design fully (or we are talking past each other). With git, everyone is a central repo, and therefore no one is, although there may be unofficial "main" repos. The way git works, everyone publishes their designs, and if the "main" repo wants to include another repo's content, the main repo pulls it in (rather than everyone uploading to the main repo, which would get messy). By publishing your fork, you have already uploaded your versions. Basically like Linux does it, except with the GVCS instead of the kernel.
     
  • *Slightly off-topic but important*
    Haven't been here for a while but still love the idea. I found out today that a free version of Inventor Fusion came to the App Store, and hence I wanted to play with it. So I looked up some schematics from the tractor model and started playing around. After finding out that some schematics were wrong, I dug further into the website and found a completely different manual, with other pieces of information floating elsewhere.

    No offence here, I think I can call myself a bit of an IT person (though not as good as you guys), but if even I can't figure out where the information is located, then how will a construction builder who is fantastic with his hands but less at home in the IT world figure it out?

    So yes, there is indeed a big demand for a sophisticated version-based system that keeps track of user input, time stamps, version checks, missing parts, etc...

    I would love to help out with this, and here are some small ideas.

    It would be great if it could be online, but an offline mode should indeed be available too. Something like github would be wonderful. Maybe a python-based program combined with html documentation, so that a person can sync with github as a visitor and have the latest version available offline. Databases can be created via plain text files and image/documentation directories, while python / php / html generates the necessary information. It should be extremely user friendly and should have no install requirements like mysql or anything similar.
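    As a sketch of the "plain text files plus generated html" idea: a few lines of python can turn a synced directory of documentation into a browsable offline index, with no mysql or other install requirements (the docs/ layout here is hypothetical):

```python
from pathlib import Path
import html

def build_index(docs_dir="docs", out="index.html"):
    """Generate a static HTML index linking every file under docs_dir."""
    links = "\n".join(
        f'<li><a href="{p.as_posix()}">{html.escape(p.stem)}</a></li>'
        for p in sorted(Path(docs_dir).rglob("*")) if p.is_file()
    )
    Path(out).write_text(
        f"<html><body><ul>\n{links}\n</ul></body></html>"
    )
```

    A visitor would sync the repo, run the script once (or the repo could ship a pre-built index.html), and then browse all the manuals offline.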

    I think you should have different levels: downloaders, submitters, and coders. The downloaders just want to get the information and start reading the manuals, the submitters can add extra documentation, and the coders build the program.

    @bcherny

    > The point is to crowd source designs in order to improve upon existing content (by iteration or branching), not to fragment the already fragmented community.

    Indeed, this is very important: one source only. I also read something about DVD documentation and such. That is the beauty of github-like tools: you download it once and you have the latest documentation. One or two months later you can ask a friend to download the latest version, put it on a USB stick and post it, or you drive to the library and sync your copy to get more information.

    I would love to participate in this project, and @bcherny, you look really willing to do it, so hopefully you don't drop out and something beautiful might start here.

    #Edit... I think it might also be a good idea to have a warehouse of parts. Every part gets its own unique ID, and if certain parts are usable between projects, then the part does not need to be re-created; the ID can be reused along with the associated CAD designs and such.

    #Edit, github,
    There might indeed be some issues with github. If we use, for example, SQLite to store information, I think things will go wrong: if user X adds a part and user Y adds a part and they then sync the database file, which is binary, that will cause conflicts, as binary files cannot be merged. If we want to use github, which I think is a great way to distribute, there could instead be an html form in the github repository for adding parts. The form would create new directories of information, which can always be synced. Use hash codes for part ids or something similar; this will prevent conflicts from occurring. Or does a sync method already exist?
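    To illustrate the alternative to a binary database: if every part lives in its own small text file, named by a hash of its content, two users adding different parts never touch the same file, so git can merge their commits cleanly. A rough sketch (the parts/ layout and the record fields are assumptions for illustration):

```python
import hashlib
import json
from pathlib import Path

PARTS_DIR = Path("parts")

def add_part(name, spec):
    """Store a part as its own JSON file, named by a content hash.

    Because each part is a separate text file, concurrent additions
    by different users land in different files and merge without
    conflict -- unlike edits to a single binary SQLite file.
    """
    PARTS_DIR.mkdir(exist_ok=True)
    record = {"name": name, "spec": spec}
    blob = json.dumps(record, sort_keys=True).encode()
    part_id = hashlib.sha1(blob).hexdigest()[:12]
    (PARTS_DIR / f"{part_id}.json").write_text(blob.decode())
    return part_id
```

    The hash doubles as the part ID the warehouse idea above calls for.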

    Another thing: I think it's good to just get started and see what kind of hurdles are encountered.


    #Another edit, linux distro,
    Indeed, as I read above (bollwyvl), it might be a nice touch to set up an OSE ubuntu distro with all the necessary components and a nice wallpaper :)

    This would of course be a separate distribution. I don't know who has a fast upload. Anyway, that is perhaps something for later.

    Just came across http://owncloud.org and it looks very, very interesting...
    I have a set-up running on my laptop now, and I think it matches almost all our demands, unless I'm mistaken.
    There are a few downsides: users cannot register themselves, they have to be registered by an administrator.
    But it is possible to create a user called OSE that has access to all the files but cannot edit them. This allows everyone to download everything via a WebDAV client and keep it in sync. As owncloud is continuously being updated, new versions with more options will arrive soon.

    Ah, and another thing... it's completely open source!

    If I have some other thoughts I'll keep modifying this post until a new post arises...


     
    In my previous post I got a bit carried away, so I wanted to show the possibilities of ownCloud here before they get buried.

    ownCloud is a software suite that provides a location-independent storage area for data (cloud storage). The project was launched in January 2010 by KDE developer Frank Karlitschek to create a free alternative to commercial cloud providers. In contrast to commercial storage services, ownCloud can be installed on a private server at no additional cost.

    A demo can be found here

    I have it currently running on my laptop and the installation was flawless. After some chmod steps everything was up and running.

    So what can owncloud mean for OSE?
    One central storage facility that everybody can sync with, either to contribute updates or to have all available data offline. Although it was not really created with massive user counts in mind, it is still possible. After installation, one user is created with all rights. From there it can create new users with either limited or admin access. Admins must create new users, as there is currently no registration button. The downside I can see right away is that we need to host the server somewhere with no cap (or a large one) on data transfer.

    Why might owncloud still be a viable option?
    Users can upload their work to the cloud, and the server takes care of versioning. Other users can sync with the server to stay up to date.
    It can handle movies, pdf and odt out of the box (read/view only). It is possible to create plain text files that people can work on, either on the server or from their own computer. Folder sharing is available to users and/or groups.

    There are also applications being developed, with more on the way. It is of course also possible to create our own applications to suit our needs, like perhaps a registration function so admins don't need to create user accounts.

    Synchronisation can be achieved via the applications supplied by owncloud, but I would advise using the WebDAV sync method, as shared folders are not synced to local storage.

    So my question to you!
    Would you think that this is suitable for what we need for OSE? If not, what is missing or perhaps what kind of applications do you have in mind?
    If so, say so too! And tell what is so good about it or what is still needed...
     
  • So what's the process here to get the ball rolling? Is it up to the community to build software which is then hopefully adopted into the OSE codebase, or can we get official support from the core OSE team?
     
    That is indeed a very good question. As it might take time for OSE to respond, I think for now it would be wise to build the application to be as friendly as possible, run some trials, and in the meantime contact OSE about a migration possibility and a list of requirements.
     
    I contacted the founder and a few higher-ups weeks ago, with no response. It would be nice to hear whether they are at least in theory interested in something like this. I want to avoid this idea ending up as yet another schematics hosting site, as in my opinion OSE is going in a very promising direction and is off to a great start. The project is still young and has a lot of room to grow, but it only has as much potential as the site's leadership is willing to allow.
     
  • hosef
    August 2012
    I am very interested in this as I have some experience in web development. My web experience is mostly in using Git and a CMF called Drupal.

    I had a few observations about the structure that I want to throw out there.

    First, Git is definitely a good choice for this since you can pull the repository from the central server, and act as the central server for your village without any problems.

    Second, we do need to keep in mind while designing this, that a GVCS tool can also be a part of another GVCS tool(power cube, battery, motor, etc).


    Third, it seemed to me that the web tool that you want to create already exists in the form of Drupal.org. I would suggest looking at a project page, an issue page, and the issue queue for Drupal Core to see if this fits what you guys were thinking of.
     
    Not sure if Drupal would do the job, but I think it needs to be some sort of warehouse. Like you said, one part can be part of other parts, but you can go further than that...

    Every part, bar, screw, plate, etc, needs an ID. 

    Then you can for example say:

    An Engine needs 4× ID234, 3× ID148, 1× ID987

    A Tractor needs an Engine (which can be given its own ID) plus 5× ID590, 4× ID111, etc...

    Then a village can say: we need a Tractor, which has an ID, and all the material that is needed is extracted and all schematics are obtained.

    This is just an example for a user who needs to extract data. When a new person comes along and wants the equipment used in Village #1, he might just say "give me everything from Village #1" and everything is acquired in a nice manual.
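    That extraction step is essentially a recursive walk over a bill of materials. A small sketch using the example IDs above (the table and quantities are of course made up):

```python
from collections import Counter

# Hypothetical bill of materials: assembly ID -> {component ID: quantity}.
# Any ID without an entry is a raw part (bar, screw, plate, ...).
BOM = {
    "ENGINE":  {"ID234": 4, "ID148": 3, "ID987": 1},
    "TRACTOR": {"ENGINE": 1, "ID590": 5, "ID111": 4},
}

def expand(item, qty=1, totals=None):
    """Recursively flatten an assembly into total raw-part quantities."""
    totals = Counter() if totals is None else totals
    for part, n in BOM.get(item, {}).items():
        if part in BOM:                      # sub-assembly: recurse
            expand(part, qty * n, totals)
        else:                                # raw part: accumulate
            totals[part] += qty * n
    return totals
```

    `expand("TRACTOR")` returns every raw part the whole machine needs; the same walk could collect the schematics for each ID along the way.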
     
  • hosef
    September 2012
    So, I had some more thoughts.

    I am assuming that we will have a repo for each tool. When someone creates a customized version of a tool, will they be branching or forking from the main repo? Branching would make it easier to push back into, or receive updates from, the main project branch; however, we would need to be careful with that, as the repo could become quite huge with many versions of the same binary file. Perhaps we should have both options: a branch option for when you are collaborating on the same project and intend to bring the changes back into the main branch, and a fork option that causes the website to spawn a new repo, with the contents of the parent project as the initial commit, for when you don't plan on bringing the projects back together.

    We will have users, which are people, and villages, which are groups of people. Is each sub-project owned by a user or a village? I believe it will probably be easier on the maintainers of the website if a project is controlled by a user, and a village can then say that they have built one. The project owner should be able to grant and revoke other users' access to the project, and should be the representative of the village for that project. There should also be some appeal process that the village can go through to assign the project to someone else if the project owner dies, goes insane, leaves on unfriendly terms, etc.


    > When a new person comes along and wants the equipment used in Village #1, he might just say give me all from village #1 and everything is acquired in a nice manual.

    That might well be a very complicated piece of code, but it would be an awesome user-experience feature. Something else that would be possible would be to ask whether you want to include instructions and part lists for the tools needed to make parts for other tools. For example, when a user asks for a tool, a prompt would come up saying 'You have requested instructions for a tool that requires part ID1234. Part ID1234 needs to be printed on a 3D printer, but it appears that your village does not have a 3D printer. Would you like instructions for a 3D printer?'
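    That prompt boils down to a set difference between the tools the requested parts need and the tools the village already has. A toy sketch (the part-to-tool table and tool names are hypothetical, apart from ID1234 borrowed from the example above):

```python
# Hypothetical mapping of part IDs to the tool required to make them,
# e.g. part ID1234 can only be made on a 3D printer.
PART_REQUIRES = {"ID1234": "3d_printer"}

def missing_tools(parts_needed, village_tools):
    """List tools a village must first build to make the given parts."""
    needed = {PART_REQUIRES[p] for p in parts_needed if p in PART_REQUIRES}
    return sorted(needed - set(village_tools))
```

    For each tool this returns, the site could then offer its build instructions, recursing until the village's existing tool set covers everything.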
     
  • Hi Guys,

    I'm a front end web developer in the San Francisco bay area. This is an inspiring project and I'd very much like to be involved.

    I think that we should keep project hosting as decentralized as possible. This way there is no central point of failure, and network/computing resources can be more easily distributed across a village, a state, or the globe. We should build less of a single, monolithic system and more of a platform that can be easily replicated and interconnected. A project maintainer should be the primary source for the project assets, but anyone who pulls a project would also serve its assets (a P2P model). A user would search for designs through a system that more closely resembles a search engine or content aggregator than a central hub.

    Also, I think we will have more success in having our platform widely adopted if we offer a full collaboration platform. By facilitating collaboration and project management we can not only encourage participation by making it easy, but also embed best practices into the workflow.

    I think we can take inspiration from the Bettermeans project. Their open enterprise model would be a good starting point for a decision-making and collaboration platform that encourages community participation without authoritative power structures. That would, in turn, encourage more participation in open hardware projects than the current economic system, which favors closed development models, and so speed up the evolution of the open hardware movement as a whole.

     
  • bcherny
    November 2012
    @bcartmell would you be interested in focusing on the server side or client side? I'm also bay area based, and have a client-side focus.

    I'm not hearing any input from the admins here, but I am convinced that what we've been discussing would be a positive direction for the project. If you're interested, we can start an OSE spin off along with any of the other commenters here that are interested in participating, with part of the founding mission being merging back with the core OSE project once we have a solid working product. Thoughts?
     
  • Beluga
    November 2012
    @bcherny I guess bcartmell is client side ("front end dev"). Btw, did you see this yet: http://www.knowable.org/
    Quote from the bottom of their front page:
    Building versioning and forking tools.

    Yes, we are big fans of tools like GitHub. If only there was something similar for physical objects, too. Well, we are working hard to make this a reality. You will soon be able to clone projects and use them as groundwork for your idea.

     
  • My talents are best suited to client side.  

    knowable.org looks very interesting, perhaps they'd like to work with us.  I see no harm in reaching out to them.

    As for creating a spin-off project, I think we should try to avoid fragmenting the community here if we can.  Also, we should try to make this system so that it can integrate well with the Linux distribution in addition to the web interface.
     
  • bcherny
    December 2012
    Knowable looks very cool, though they seem geared more towards first-world hackers and the social aspect of open source than towards improving the core aspects of open-sourcing hardware (repo, simulator, distribution). I would love to work with the existing OSE platform, but the admins don't seem too receptive. Should we go forward with this, or do we first need an OK from the organizers?
     
    I'm in favor of moving this project forward, though we'll need some backend engineers.
     
  • December 2012
    Some stuff happening around the broadest system-focused collaboration thing that I know about:
    http://forum.opensourceecology.org/discussion/965/vehicleforge

    I think the original title here, "Proposal: Phase II for Open Source Ecology's online app" is the right idea:
    https://github.com/bollwyvl/ose-it-proposal/blob/master/PROPOSAL.md
    Due to the shift in the dialog above, I have changed the title to reflect that we really need to look at the roadmap for the effort, and not just improving the experience today.

    Starting here:
    http://opensourceecology.org/wiki/Category:Software
    I think we can document the current software needs of the community, both from a collaboration standpoint, as well as a design tool standpoint.

    Thoughts on the approach?
     
    I think we need to make a well-defined feature list. I'll draw one up.
     
  • Xienixs
    December 2012
    I have a Mac mini that I basically only use as a media center / for downloading. Shall I set it up as a playground for this project? I know it's not heavy machinery, but I think this should be a lightweight application anyway. What do you think?
     
  • Hi guys, 

    Sorry for the delay getting back; I've been away for the holidays. Here is a very rough first pass at a feature list. Please review and offer any adjustments you think would be good.


    Version Control:
      - Fork and merge
      - Branching
      - Image auto-generation from CAD
      - Image diff
      - 3D diff
      - Modular feature set
      - Auto compression

    Distributed (P2P):
      - With local backups for some items

    Discoverable:
      - Searchable
      - Browsable
      - Interlinked
      - Modular

    Interface:
      - Intuitive
      - Easy to use

    Ticket and bug tracking:
      - Collaborative
      - Tickets link to relevant changes
      - Collaborative decision making

    Manufacturing integration:
      - Output to 3D printer
      - Output to CNC


    @Xienixs: I don't think we are quite ready to worry about the dev platform yet, as we are still solidifying requirements. When we are ready to begin development, I think it should be built and run in an open environment. I'm sure that the yet-to-be Engineering team will have valuable input here as well.
     
  • @bcartmell that looks good to me, and I'm sure that anything we missed will come up pretty quickly once we begin development.

    We need a better way to communicate, as I don't think any of us check this thread daily - anyone opposed to a google group for this project?
     
  • I agree that we need a better collaboration system.  Moving to Google Groups sounds like a great option right now.  Do you want to set it up?

    I would also like to look at using Better Means, minus the compensation features (video). There doesn't seem to be much activity anymore, and I can no longer find their main page, but it is still on GitHub (here). If their system works well, I think it could not only benefit this project, but could also end up being a useful tool for the larger community here, as well as provide insights on how to manage group-based decision making and collaboration in future self-sustaining, open villages and societies.
     
  • @bcartmell @Xienixs @bollwyvl @Beluga @hosef Our new google group is here - https://groups.google.com/forum/?fromgroups#!forum/osephase2. Please request an invite when you see this and I'll add you. I hope groups will be a better collaboration platform for this project, and will allow us to communicate more freely.

    @bcartmell Better Means looks very interesting, I'll take a closer look at their code soon. I'd also be curious what led their development to stop.
     
  • Requested membership to the Google Group.

    Regarding Better Means:
    When I first found the project, they were trying to sell it as a hosted service. I suspect they weren't able to make a profit that way. But it looks like development of the software is continuing, as there were commits just 7 days ago.
     
