Control an interactive prototype's behavior with custom code (JavaScript, Python)
XD is a step in the right direction, but as far as we can tell it does not support scripting such as JS, so we can only prototype using static images instead of the HTML5 code we want to use to test the overall look of the planned website.

3 comments
-
Anonymous commented
Python code support, where inputs can be integrated with triggers, so that clicking a square, for example, triggers Python code, and where the output of the print command can be routed into text boxes.
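For illustration, the kind of trigger-to-code wiring being asked for looks like this in a plain HTML/JS prototype (the request is for Python, but the pattern is the same; all element names here are made up):

    <!-- Sketch only: clicking the square runs code, and the "print"
         output lands in a text box on the same page. -->
    <div id="square" style="width:80px; height:80px; background:teal"></div>
    <input id="output" type="text" readonly>
    <script>
      document.getElementById("square").addEventListener("click", function () {
        document.getElementById("output").value =
          "square clicked at " + new Date().toLocaleTimeString();
      });
    </script>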
-
David commented
JavaScript integration into Design or Prototype mode. The current capabilities are clearly insufficient when it comes to states, microinteractions, etc. An artboard is effectively a webpage, and the ability to trigger other states from any other component on the page should be available. I expect a third-party plugin developer will solve this before Adobe does.
-
Paul Mackinnon commented
OK, I have been working with students on advanced interaction design, where we used HTML, CSS, and JS to achieve the desired results.
After this project, I can see that many students are designers first and will prefer UI tools (such as XD), which do not allow any backend connections.
This is a problem.
No tool so far easily meets this requirement.
The backend we are using is Firebase, whose Realtime Database integrates easily with HTML projects. That has enabled a "remote control" function where a separate mobile app acts as the "behind the screen wizard". This lets us emulate situations and sensor data that would be much more difficult to produce for real. An example is proximity:
A user walks up to a kiosk - the kiosk detects the presence of a user and reacts accordingly.
Actually implementing this in a prototype is too hard and slow. Instead, we have another person with the remote control press a button, signalling via Firebase that such an event occurred. The kiosk app then receives that signal in real time, creating the illusion of smart sensors that in fact we do not have.
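For reference, the "remote control" pattern is only a few lines with the Firebase Web SDK. A minimal sketch, assuming a placeholder project URL and an events/proximity path of our own choosing:

    <!-- Kiosk page: listens for the wizard's signal in real time. -->
    <script src="https://www.gstatic.com/firebasejs/8.10.1/firebase-app.js"></script>
    <script src="https://www.gstatic.com/firebasejs/8.10.1/firebase-database.js"></script>
    <script>
      // Placeholder config - substitute your own project's databaseURL.
      firebase.initializeApp({ databaseURL: "https://YOUR-PROJECT.firebaseio.com" });

      // Fires whenever the remote control writes a new proximity event.
      firebase.database().ref("events/proximity").on("value", function (snapshot) {
        var event = snapshot.val();
        if (event && event.present) {
          document.body.classList.add("user-present"); // the kiosk "detects" the user
        }
      });
    </script>

    <!-- Remote-control page (the wizard): one button press emulates the sensor.
         (Same SDK includes and initializeApp as above.) -->
    <button onclick="signalProximity()">User approaches kiosk</button>
    <script>
      function signalProximity() {
        firebase.database().ref("events/proximity").set({ present: true, at: Date.now() });
      }
    </script>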
The dilemma is that Firebase integrates easily with HTML, but not all students are fluent enough in coding to do this. I managed to integrate it with Adobe Animate, but there were some problems, which again would challenge the students unnecessarily.
A more appropriate solution would be a killer feature for XD that allows a backend such as Firebase to trigger events, or even an Adobe-provided backend or a bespoke mechanism for this type of experience prototyping.
Since Animate comes quite close to allowing this easy integration, I think the same would be possible in XD, but aimed at less code-literate UX designers.
The feature could also be integrated as part of other feature requests - I have commented on related ones, such as widgets that use JavaScript. All that is needed is a global import, customized global JS, and some JS control of screen widgets, microinteractions, etc.
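To be concrete about the shape of what I am asking for, here is a purely hypothetical sketch - none of these XD APIs exist today, and every name is invented for illustration only:

    // HYPOTHETICAL - no such XD scripting API exists; invented names only.
    xd.importGlobalScript("prototype-logic.js");      // the global import

    // Global JS driving widgets / microinteractions on an artboard:
    xd.widget("loginButton").on("tap", function () {
      xd.widget("spinner").setState("visible");       // trigger another state
      myBackend.signIn()                              // e.g. a Firebase call, also hypothetical
        .then(function () { xd.goToArtboard("home"); });
    });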
I know it's a big ask, but unless we find a product with this feature we will not be 100% behind any offering, and painful workarounds leave a bad feeling.
I do think the XD product is probably in a position to include such a facility via general JS integration as described, which would also solve a lot of other problems, such as live data, choosing a different backend, or multiple users interacting via messaging in their prototypes.