Standardised inter-cyborg protocol

06:39 <GarethTheGreat> we should both agree on a standard control module
06:39 <GarethTheGreat> if it can be standardised we could combine resources more easily
06:39 <Diethyl> yeah there are some similarities, I mean you have influenced me
06:40 <Diethyl> but there are a lot of extra things that I've added, that I'm certain no one has thought of yet
06:40 <GarethTheGreat> i'm going to build around intel edison just because it's an off the shelf part
06:40 <GarethTheGreat> any extras can be added in the bluetooth network
06:40 <Diethyl> it would be nice to standardize things or at least communication protocols
06:40 <GarethTheGreat> bluetooth already exists
06:40 <GarethTheGreat> we could standardise on top of that
06:41 <Diethyl> I mean more in the way of exchanging data physically 
06:41 <GarethTheGreat> well, realistically it has to be wireless
06:41 <GarethTheGreat> and why reinvent the wheel?
06:41 <GarethTheGreat> bluetooth and wifi already exist
06:41 <Diethyl> so when we are in public we can sense other cyborgs around us
06:41 <GarethTheGreat> ah
06:41 <GarethTheGreat> that'd be cool
06:42 <Diethyl> that and a handshake
06:42 <GarethTheGreat> brb
06:42 <Diethyl> to exchange data
06:42 <Diethyl> okay
06:43 <Diethyl> basically a protocol to exchange keys for a two-way verification system
06:44 <Diethyl> I think it would be fun to be able to have the handshake actually be a physical handshake
06:47 <GarethTheGreat> could have each control module broadcast an ad-hoc wifi network by default named "CyborgNet"
06:48 <Diethyl> I'm thinking the implant would be somewhere in the forearm near the elbow, it seems like it would cause fewer problems there
06:48 <GarethTheGreat> have a UDP broadcast that contains basic details
06:48 <GarethTheGreat> name, supported features/implants, etc
06:48 <Diethyl> you would do a forearm handshake and you would be aligning the rfid on your hand with the reader on the other person's forearm
06:49 <GarethTheGreat> mind if i copy/paste these logs to the forums?
06:49 <GarethTheGreat> it's a cool idea that could do with some input
06:49 <Diethyl> yeah that would be awesome


Comments

  • For those unwilling to parse the IRC logs, here's the basic idea:

    An ad-hoc wifi network broadcast by an implant (the control module for my exocortex project) with the SSID CyborgNet, where each node broadcasts a UDP beacon advertising its presence and giving details (user's name, list of implants, basic biographical data, and optionally current GPS co-ordinates).

    Combined with the subvocal recognition and bone conduction transducer this could also be used for silent communication cyborg-to-cyborg, and if both have an eyetap then realtime video streaming could be shared along with other arbitrary data.

    Theoretically all of this could be done with wearables to gain mainstream appeal, and it's the kind of thing that could be crowdfunded on Kickstarter or similar.
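    To make the beacon idea concrete, here's a minimal sketch in python - the port number, field names and payload layout are all assumptions for illustration, not a spec:

```python
import json
import socket

BEACON_PORT = 4040  # hypothetical port - nothing here is part of a real spec yet

def make_beacon(name, implants, gps=None):
    """Build the JSON beacon payload advertised over the ad-hoc network."""
    beacon = {"name": name, "implants": implants}
    if gps is not None:
        beacon["gps"] = gps  # optionally include current coordinates
    return json.dumps(beacon).encode()

def broadcast_beacon(payload, port=BEACON_PORT):
    """Fire one UDP broadcast datagram; call this every few seconds."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(payload, ("255.255.255.255", port))
    sock.close()

payload = make_beacon("GarethTheGreat", ["exocortex", "bone-conduction"])
```

    A receiver would just bind a UDP socket to the same port and json-decode whatever arrives; payloads would need signing before anything in them could be trusted.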
  • So the way I was planning on doing all of my implants is over Bluetooth LE - this way things last longer - and I'm going to use my phone as the hub.

    So I'm going to have the transducer (hopefully implanted at grindfest) transmit any phone audio to my implant. It also has a mic on it, so all you really need is some kind of connection between phones (in my case). But I really do like the Edison idea. If a standard was set, I think it would be cool if everyone's implants connected and became a mesh network.

    Do you think a phone or a dedicated device is better to act as the hub/server for the cyborg body?
  • A phone is a multi-use device, i'm talking about something specialised.
  • I, personally, find this idea both intriguing and slightly terrifying at the same time. Would you have all of your implants interface with the broadcasting implant? While that could be useful in some ways, it would also provide a much more direct route for hackers to meddle with implants in potentially painful and irritating ways.

    On another note, constantly broadcasting Wi-Fi is going to put a fair drain on battery systems. By the time this actually gets into the prototyping phase, I expect that we'll have developed graphene super-caps far enough to actually utilize them in a design.

  • Security is obviously always going to be an issue, but if implemented right this wouldn't introduce more security risks.

    As for wifi, well i'd be using it constantly anyway.
  • Oh this is so cool.

    You could easily build this as a wearable, at least for proof of concept. 

    Basic housing encased on the forearm and transmission coming through on the palm. NFC would make it low-ish power, direct, and possibly more secure than transmitting all this data over WiFi.

    You're effectively just exchanging a contact card. Combined with some decent AR glasses, you could communicate quietly across the protocol. You could even use phrase lists to communicate without speaking. 
  • edited September 2015
    There was an anime I watched (Baccano!) where there was a group of immortals, and the only way they know if someone else is an immortal (short of straight up knowing them) is that when asked their name, they cannot lie in the presence of another immortal. I do like the idea of something like this. It could easily be an app rather than a thing we all have to go implant, though. Or if it's an implant, keep it simple: have it constantly broadcasting a signal (bluetooth maybe, but i'm not sure) and only looking for that same signal, nothing else. Have it set so that if it picks up the signal it'll administer a small pulse that you'd be able to feel. You could also make it get "louder" the closer you got. Although again, an app would be great for this since i'm more likely to remember to charge my phone than a chip in my ass. Just have a notification pop up when you've been near someone else. There are already apps that do this, just rework it to be just for grinders. If it takes off, eventually we could make it into an implant.
  • @BodyHackingCon NFC in the palm wouldn't have long enough range, or be feasible to implant into the palm with another device elsewhere in the body, not to mention the palm isn't a good place in general for implants due to risk of damage.

    As for comms, see my exocortex thread - subvocal is what i'm looking into.
  • @Chironex the goal is to integrate with implants, obviously you can do something similar for finding others just using a phone but that means no silent comms and other unforeseen future applications
  • ya but you can't make an implant based on "unforeseen future applications". It's an implant. I think you may be forgetting that this thing would be going INSIDE YOU. You can't just pull it out to tweak the thing. So either you go the full 9 yards and make a system that would be great if you've got a group of people who meet often and want to communicate without speaking, or you're using it to find people. The latter is easy and testable. The former, less so. Your subvocal thing may do the trick but that's a long way off.

    I'm not saying this is a bad idea, just maybe a project for down the road when we've worked out some other things that are needed to pull it off. Like your subvocal thing, for example. And better coatings, etc.

    One other way you could do this is have it work almost like sonar. Have it listening for a pulse and broadcasting one. If it gets one, immediately send another. If there's actually someone there, theirs will do the same. Then you could have a magnetic switch in the implant that would send a pulse when pressed. Then you can just morse code the shit out of each other. Of course this could be scaled up eventually to replace morse code with like a screen, or a neural connection or something. But this is what I mean: full on communication is a pain in the arse. But a fun and simple implant could be neat, especially if it's cheap and easy for people to get.
  • You can build a platform for unforeseen future applications by providing all the hooks and interfaces for future upgrades.

    I really plan to go ahead with building a general-purpose embedded computer suitable for implantation for precisely this reason.
  • To clarify - a general purpose computer can have new software loaded, and it can also be augmented with other computers over a network link.

    Need more CPU? Stick another implant in or use a wearable or an internet link, no need to pull the primary out.

    Mobile internet is getting good enough now that it's feasible to do the heavy lifting on a remote server.
  • ya but then you end up with what I like to call old city architecture. No matter how shiny the new buildings are, the old ones are still there at its core, and if you base everything on it and it fails, you're fucked and have to start fresh. I can appreciate the idea though, and by no means let me stop you - in fact i'd encourage you to, since if you pull it off i'll be your first customer. But designing what is essentially an app for a gadget that doesn't exist is odd, I feel.
  • My concern is what exactly you plan on processing inside you. Why implant a computer for networking when a phone is a better solution? Subvocals, body measurements like temperature and glucose, those are great. I just don't see the need for a generic platform to be implanted. You'd have to physically modify it to add any functionality, so it would be better to have a distributed system.
  • I think there's some misunderstanding here - what OP is proposing isn't a generic platform, it's a generic protocol. Think HTTP - it can be implemented on multiple different platforms over a large period of time.
  • I'm proposing both - a protocol and a recommended implementation of that protocol.
  • For implementation, Bluetooth LE might be better. It is perfect for this application, as packet size is small (so a handshake is possible) but it uses so little energy that a simple 40mAh battery can keep the thing up for a month on one charge.

    So for discovery I think BLE beacon functionality would come in handy, and after the handshake/protocol exchange/pairing, an information channel could be opened over e.g. WiFi.

    This way the beacon/discovery part is completely low-power, while high-power high-bandwidth transmission can take place when needed.

    The protocol itself for high-power transmissions could even be HTTP - it's well-developed, easy to adopt, and handles pretty much every client-server interaction (and to be honest, this kind of communication IS client-server communication, with the roles being swapped continuously). And the data sent could be in JSON format: easy to interpret, fast to process, and generally capable of carrying any kind of information - from contact cards to complex objects.
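    To make the two-phase idea concrete, here's a sketch of the kind of JSON contact card that could be fetched over HTTP once BLE pairing is done - the field names and the required-field rule are assumptions, not anything agreed on:

```python
import json

# Hypothetical card a node might serve (e.g. at GET /card) after pairing.
CARD_JSON = """{
  "name": "Diethyl",
  "implants": ["rfid", "bone-conduction-transducer"],
  "services": [{"type": "text-chat", "transport": "wifi", "port": 8080}]
}"""

def parse_card(raw):
    """Decode a contact card, rejecting ones missing the basics."""
    card = json.loads(raw)
    for field in ("name", "implants"):
        if field not in card:
            raise ValueError("card missing required field: " + field)
    return card

card = parse_card(CARD_JSON)
```

    The "services" list is where the high-bandwidth side would hang off the low-power discovery: each entry tells the other node what it can connect to over WiFi once it decides to.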

  • This could have some great in-field medical applications if you have sensors that can provide vital stats updated in real time; you could use that information in teams or groups of people (military or athletic training) to identify heart rate, oxygen consumption, blood pressure, hormone levels, etc. You could make it connectable to a wireless interface as well to record and log that data for further analysis.
  • For medicine there's already plenty of solutions, though I suppose a standard would be cool.
  • Starting to design this protocol in detail
  • So here's how the protocol is looking so far in terms of features, code to come soon:

    • TCP/IP or bluetooth transport
    • Various modules and a hub - the idea is that things like sensors and other implants are modules and talk to the hub
    • Modules can be physical implants or pure software
    • Event feeds and streaming binary data with pub/sub pattern
    • Input feeds - modules can accept events or binary data from other modules without needing to subscribe
    • Cryptographic authentication of modules to the hub and pairing
    • Public/private pub/sub feeds
    • Service advertisements - modules can advertise arbitrary network services, accessible over wifi network or bluetooth
    • Low CPU load (so small AVR chips etc can work fine)
    For an example of how this might work:
    You pair a subdermal bone conduction bluetooth headset with the hub; it advertises a private raw binary feed of audio data from the microphone and subscribes to another raw binary feed of audio data from a mixer module (implemented in software).

    A speech recognition module subscribes to the audio feed and publishes a text feed, which the UI module subscribes to - the UI module publishes another text feed that the TTS module subscribes to, the TTS module publishes audio data to the mixer's input feed.

    The mixer then publishes audio data to the bluetooth headset's input feed and the user hears the response.

    I realise this sounds a tad over-engineered, but the end goal is to make a general platform for linking implants together and linking people together.
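    The routing described above could be sketched like this - the module and feed names follow the example, but the API itself is invented for illustration:

```python
from collections import defaultdict

class Hub:
    """Routes feeds between modules; transports (wifi/bluetooth) are hidden here."""
    def __init__(self):
        self.subscribers = defaultdict(list)  # feed name -> list of callbacks

    def subscribe(self, feed, callback):
        self.subscribers[feed].append(callback)

    def publish(self, feed, data):
        # Everything goes through the hub - modules never talk directly.
        for callback in self.subscribers[feed]:
            callback(data)

hub = Hub()
heard = []

# Speech recognition publishes text; the UI replies via the TTS input feed.
hub.subscribe("speech.text", lambda text: hub.publish("tts.in", "echo: " + text))
# Stand-in for the mixer/headset end of the loop.
hub.subscribe("tts.in", heard.append)

hub.publish("speech.text", "hello")
# heard is now ["echo: hello"]
```

    Because the links are all indirect, swapping the headset for a subvocal system just means a different module publishing to and subscribing from the same feeds.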
  • Any interest in making a block diagram example?
  • I'll knock up a diagram and post it later this evening
  • [image: block diagram of the basic UI loop]
    So here's a diagram of the basic UI loop, you can see how all the modules fit together and enable swapping out of various bits (you could for example swap out the bluetooth headset for a subvocal system or when the tech gets there a more advanced invasive BCI)

    The text streams are event feeds - they send single events rather than a constant stream of data, while the audio streams are binary streams. The links between the modules are not direct, everything is routed through the hub, which handles connecting across different transports (wifi and bluetooth and any other transports yet to be developed).

    I plan on adding stuff like a datalogging module that subscribes to feeds from any sensor modules and saves the feed to disk by sending to a storage module (doing it this way allows abstracting away the underlying storage mechanism).

    The feeds in this example are all private - that is, they can't be subscribed to except by a paired module. For communications between cyborgs a feed can be made public, enabling a module not paired to the same hub to subscribe to it or to send events/data.

    In terms of implementation i'm knocking together an alpha in python to get all the low-level protocol details right before releasing a proper spec. Then i'll rewrite the hub in erlang and write a simple C library for modules to use - the C library should be portable so that it can run on microcontrollers, while the hub does not need to be so portable and can depend upon the nice features erlang offers.
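    One simple way the cryptographic authentication of modules to the hub could work is an HMAC challenge-response over a key established at pairing time - this is a sketch of the general technique, not the actual spec:

```python
import hashlib
import hmac
import os

def challenge():
    """Hub sends a fresh random nonce to the module."""
    return os.urandom(16)

def respond(pairing_key, nonce):
    """Module proves it holds the pairing key without ever sending it."""
    return hmac.new(pairing_key, nonce, hashlib.sha256).digest()

def verify(pairing_key, nonce, response):
    """Hub recomputes the MAC and compares in constant time."""
    expected = hmac.new(pairing_key, nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

key = os.urandom(32)        # established once, at pairing time
nonce = challenge()
response = respond(key, nonce)
# verify(key, nonce, response) is True only for the module that holds the key
```

    A fresh nonce per handshake stops replay attacks, and nothing secret crosses the wire - which matters when the wire is a broadcast wifi network.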
  • Other modules i've started designing are things like a haptic feedback module, tDCS and CES stimulators etc.

    Put simply, this thing should allow for a slow modular upgrade of capabilities in the human body in a way that makes all the different parts able to work together.
  • @garethnelsonuk Link is broken.
  • I'll be moving stuff around a lot, it's in doc/ now
  • I noticed you mentioned that devices which throw error messages would need to go through the handshake procedure again. How much memory gets chewed up running through that procedure?

    I think it would be wise to have some sort of active troubleshooting element on the hub that attempted to fix the error generated by a module, before it breaks the connection.
  • Running through the handshake again would just mean running a loop - you don't use any extra memory, just extra time.

    Cutting the connection is the cleanest way to handle errors if there's limited resources in the modules.