<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:media="http://search.yahoo.com/mrss/"><channel><title><![CDATA[Trend Report]]></title><description><![CDATA[Connect, Learn, and Share]]></description><link>https://trends.techlab.works/</link><image><url>https://trends.techlab.works/favicon.png</url><title>Trend Report</title><link>https://trends.techlab.works/</link></image><generator>Ghost 3.20</generator><lastBuildDate>Tue, 24 Mar 2026 20:03:10 GMT</lastBuildDate><atom:link href="https://trends.techlab.works/rss/" rel="self" type="application/rss+xml"/><ttl>60</ttl><item><title><![CDATA[Streamline your design process with these AI-Powered Figma Plugins!]]></title><description><![CDATA[<p>Something every designer would love to hear is, "This can make your job easier!" or "This will give you an unfair advantage!" Now that Artificial Intelligence is <a href="https://theresanaiforthat.com/">literally being used everywhere</a>, I would love to share some of the AI-powered <a href="https://www.figma.com/">Figma</a> plugins that I've used and tested personally that can</p>]]></description><link>https://trends.techlab.works/10-ai-figma-plugins/</link><guid isPermaLink="false">644a0654bd93d2000162236e</guid><dc:creator><![CDATA[Darren Phung]]></dc:creator><pubDate>Thu, 27 Apr 2023 06:26:42 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2023/04/Robot-Working-on-computer-1.png" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2023/04/Robot-Working-on-computer-1.png" alt="Streamline your design process with these AI-Powered Figma Plugins!"><p>Something every designer would love to hear is, "This can make your job easier!" or "This will give you an unfair advantage!" 
Now that Artificial Intelligence is <a href="https://theresanaiforthat.com/">literally being used everywhere</a>, I would love to share some of the AI-powered <a href="https://www.figma.com/">Figma</a> plugins that I've personally used and tested, which can help streamline your design workflow.</p><p>1. Ando AI</p><p><a href="https://www.figma.com/community/plugin/1145446664512862540/Ando---AI-Copilot-for-Designers">(Download the Plugin)</a></p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2023/04/image.png" class="kg-image" alt="Streamline your design process with these AI-Powered Figma Plugins!" srcset="https://trends.techlab.works/content/images/size/w600/2023/04/image.png 600w, https://trends.techlab.works/content/images/size/w720/2023/04/image.png 720w"></figure><p>Ando AI helps you generate millions of design ideas from prompts, shapes, and images. It can help designers populate their mid/high-fidelity wireframes with relevant custom images and visual assets, making the wireframes more beautiful. (P.S. The post image was created using Ando AI!)</p><p>2. SPELLL</p><p><a href="https://www.figma.com/community/plugin/754026612866636376/SPELLL---Spelling-%26-Grammar-Checking-for-Figma-%26-FigJam">(Download the Plugin)</a></p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2023/04/image-1.png" class="kg-image" alt="Streamline your design process with these AI-Powered Figma Plugins!" 
srcset="https://trends.techlab.works/content/images/size/w600/2023/04/image-1.png 600w, https://trends.techlab.works/content/images/size/w1000/2023/04/image-1.png 1000w, https://trends.techlab.works/content/images/size/w1600/2023/04/image-1.png 1600w, https://trends.techlab.works/content/images/size/w2048/2023/04/image-1.png 2048w"></figure><p>Nothing hurts more than presenting your beautiful wireframes and catching spelling mistakes smack bang in the middle of your presentation. Embarrassing. It makes me wonder why companies don't build proprietary spell checking into their design tools. As a chronic mis-speller, I consider this plugin paramount to my design process. It helps a designer check spelling, grammar, and even vocabulary, and then fix the errors with a single click.</p><p>3. MagiCopy</p><p><a href="https://www.figma.com/community/plugin/1184110746118034942/MagiCopy-%E2%80%93-AI-Text-Generator">(Download the Plugin)</a></p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2023/04/99d25042-d763-493f-b891-cba937adc0a8-cover-min.png" class="kg-image" alt="Streamline your design process with these AI-Powered Figma Plugins!" srcset="https://trends.techlab.works/content/images/size/w600/2023/04/99d25042-d763-493f-b891-cba937adc0a8-cover-min.png 600w, https://trends.techlab.works/content/images/size/w1000/2023/04/99d25042-d763-493f-b891-cba937adc0a8-cover-min.png 1000w, https://trends.techlab.works/content/images/size/w1600/2023/04/99d25042-d763-493f-b891-cba937adc0a8-cover-min.png 1600w, https://trends.techlab.works/content/images/size/w1920/2023/04/99d25042-d763-493f-b891-cba937adc0a8-cover-min.png 1920w"></figure><p>Tired of thinking of a jaw-dropping punchline for your landing page? With MagiCopy you're able to generate copy for your landing page according to the industry and product you are designing for. If you don't like the generated copy, you can always use it as a guide or inspiration to fine-tune your final punchline.</p><p>4. 
Stark</p><p><a href="https://www.figma.com/community/plugin/732603254453395948">(Download the Plugin)</a></p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2023/04/Stark-Tutorial-Cover.png" class="kg-image" alt="Streamline your design process with these AI-Powered Figma Plugins!" srcset="https://trends.techlab.works/content/images/size/w600/2023/04/Stark-Tutorial-Cover.png 600w, https://trends.techlab.works/content/images/size/w950/2023/04/Stark-Tutorial-Cover.png 950w"></figure><p>Stemming from my previous article about WCAG compliance: one of the easiest ways to fall out of compliance is the incorrect use of colour. Stark is the Figma version of <a href="https://webaim.org/resources/contrastchecker/">https://webaim.org/resources/contrastchecker/</a>, letting you see the contrast score for colours and different sizes of text in real time.</p><hr><p>As AI advancements in Figma plugins progress, I will update this article with the useful plugins I end up using. </p><p>If you found these useful, please do share them with your fellow designers. </p>]]></content:encoded></item><item><title><![CDATA[Nothing is happening in AI and ML right now...]]></title><description><![CDATA[<p>Well, well, well, what do we have here? Another trend report on the exciting world of Artificial Intelligence? Oh boy, I can hardly contain my excitement. I mean, let's face it, nothing interesting at all has happened in the world of AI recently, nope, nothing at all. 
Just kidding, of</p>]]></description><link>https://trends.techlab.works/nothing-is-happening-in-ai-and-ml/</link><guid isPermaLink="false">63bf81ebbd93d20001622259</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Thu, 12 Jan 2023 05:24:28 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-162350.png" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-162350.png" alt="Nothing is happening in AI and ML right now..."><p>Well, well, well, what do we have here? Another trend report on the exciting world of Artificial Intelligence? Oh boy, I can hardly contain my excitement. I mean, let's face it, nothing interesting at all has happened in the world of AI recently, nope, nothing at all. Just kidding, of course. AI is a constantly evolving field, and there's always something new and exciting happening.</p><p>One area of AI that has been making waves recently is the field of language generation, and more specifically, the development of GPT-3 based models. GPT-3, or Generative Pre-trained Transformer 3, is a state-of-the-art language generation model developed by OpenAI. The model has been trained on an enormous dataset of text, and it is able to generate human-like text with remarkable coherence and fluency. The results are so impressive, in fact, that some people have even dubbed GPT-3 the "AI writing assistant."</p><p>ChatGPT is a variation of GPT-3. It's designed to generate more natural, conversational text. It can understand the context of what is being said, allowing it to respond in a more human-like way. This ability makes ChatGPT a powerful tool for tasks such as chatbots, conversational AI, and language translation. 
With ChatGPT, you can generate text that reads as if it were written by a human, and in many cases it is even more coherent and natural than human-written text.</p><p>In this article, we will take a closer look at ChatGPT and explore its potential use cases and implications for a wide range of industries. From automating customer service chatbots to helping write news articles, ChatGPT is set to have a big impact on the way we interact with language-based technology in the future. So, sit back, relax, and enjoy the ride as we delve into the world of ChatGPT.</p><p>This opening section, of course, was entirely written by ChatGPT, responding to the prompt:</p><!--kg-card-begin: markdown--><blockquote>
<p>Write an intro to chatgpt for a TechLab trend report article, it should be conversational, some sarcasm and humour is ok, for example &quot;noting interesting at all has happened in the world of AI recently, nope nothing at all&quot;. Trend report articles generally are informational for academics and university staff in fields other than technology.</p>
</blockquote>
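For the curious, prompts like this can also be sent programmatically rather than through the chat interface. The sketch below only assembles the JSON request body; the endpoint, model name, and parameters are assumptions based on OpenAI's public API documentation rather than anything used in this article, and you would need your own API key to actually send it.

```python
import json

def build_completion_request(prompt, model="text-davinci-003", max_tokens=256):
    """Assemble the JSON body for OpenAI's /v1/completions endpoint.

    The model name and parameter set here are assumptions taken from the
    public API docs; check the current documentation before relying on them.
    """
    return {
        "model": model,            # which language model to query
        "prompt": prompt,          # the instruction, e.g. the prompt quoted above
        "max_tokens": max_tokens,  # upper bound on the length of the generation
        "temperature": 0.7,        # higher values give more varied output
    }

payload = build_completion_request(
    "Write an intro to ChatGPT for a TechLab trend report article."
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to `https://api.openai.com/v1/completions` with an `Authorization: Bearer <API key>` header, and the generated text comes back in the response's `choices` field.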
<!--kg-card-end: markdown--><p>ChatGPT is just one of many tools that have emerged in recent months, but it has certainly been one of the most discussed. Its uses have ranged from refining and <a href="https://arstechnica.com/information-technology/2023/01/chatgpt-is-enabling-script-kiddies-to-write-functional-malware/">obfuscating malicious code</a> and <a href="https://ia.acs.org.au/content/ia/article/2023/chatgpt-could-be--nuclear-weapon--for-cyber-war.html">improving viruses to avoid detection</a> to advising on how to create explosives for acts of terrorism. OpenAI, the company behind ChatGPT, has, however, taken steps in recent weeks to curb the "exploits" that grant access to more pernicious responses. </p><p>Indeed, the capacity for fun with the system is near limitless. I have had it developing content for emails, communications, and of course this article, but also more creative endeavours like: </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image.png 600w, https://trends.techlab.works/content/images/size/w813/2023/01/image.png 813w"><figcaption>A truly beautiful sonnet.</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-1.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." 
srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-1.png 600w, https://trends.techlab.works/content/images/size/w787/2023/01/image-1.png 787w"><figcaption>The capability of ChatGPT is truly biblical in scale</figcaption></figure><p> <br>I also decided to use it for my own benefit in salary negotiations and came up with the following: </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-4.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-4.png 600w, https://trends.techlab.works/content/images/size/w825/2023/01/image-4.png 825w"><figcaption>please stand up</figcaption></figure><p>One of the benefits of ChatGPT's model is that it remembers everything you ask it. I wasn't happy with the rap above because I don't work in sales, so I provided it with some more context:</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-5.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-5.png 600w, https://trends.techlab.works/content/images/size/w793/2023/01/image-5.png 793w"><figcaption>Ok your move Sanjay</figcaption></figure><p>I'll include a comprehensive gallery of other examples below. I have used it to write code, come up with business ideas, rewrite segments of text, produce copy and monologues for my Dungeons and Dragons campaign, and generate a walking tour around Tokyo, which I then asked it to convert to a leaflet.js map of points of interest, among many other things. Suffice it to say that for lots of tasks ChatGPT will revolutionise your life, but it is important that you acknowledge the big elephant in the room. 
ChatGPT doesn't actually know what the truth is, but it will confidently tell you that it knows.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-6.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-6.png 600w, https://trends.techlab.works/content/images/size/w810/2023/01/image-6.png 810w"><figcaption>Points for confidence.</figcaption></figure><p>A language warning applies, but the academic paper "<a href="https://journal.sjdm.org/15/15923a/jdm15923a.html">On the reception and detection of pseudo-profound bull$#!t</a>" sums it up nicely:</p><blockquote>“It is impossible for someone to lie unless he thinks he knows the truth. Producing bull$#!t requires no such conviction.” – Harry Frankfurt</blockquote><p>ChatGPT doesn't know the answer, so it can't lie. It just sometimes spouts bull crap. Behind the scenes, of course, we know that ML models of this nature generally have a confidence or uncertainty measure, and in future iterations this may be available to the consumer, but as of now you don't know when you are being told the truth, or a rough approximation of it. ChatGPT is currently free, so it is attracting a lot of attention as people test it and push its limits, which can only lead to a better product in the long run, but it isn't the only AI tool that is making waves right now. </p><p>Generative Art is an area that is currently receiving mixed reviews. The quality of the outputs is unarguably marvellous, but the ethics of the model training raise serious questions. At the very least, you sometimes see a ghostly signature on the generated image; at worst, it's a direct copy. Among the best of the current offerings is Midjourney, able to generate in a multitude of styles and at acceptable print resolutions, and delivered through a Discord bot that allows for generation of images anywhere. 
It is not limited by style and can master some very complex themes and narrative elements.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-7.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-7.png 600w, https://trends.techlab.works/content/images/size/w740/2023/01/image-7.png 740w"><figcaption><strong>A photo of a University student in an exam, facing the camera, panicking about plagiarism in their test</strong></figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2023/01/image-8.png" class="kg-image" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/image-8.png 600w, https://trends.techlab.works/content/images/size/w754/2023/01/image-8.png 754w"><figcaption>4 options for <strong>product design and packaging for a bachelor's degree from a prestigious university.</strong></figcaption></figure><p>For work in presentations, or to communicate inspiration to a designer or artist, this tool is incredibly powerful. It has formed a large part of how I describe locations for my Dungeons and Dragons game, as well as discussions with my tattoo artist about what styles I'm interested in, but at this stage I think it's best to avoid commercial applications, despite the licence allowing it.</p><p>In the last 3 months we have tried over 100 different AI tools across a huge range of areas, from Text to Speech, to Text to Video, to Music, to Translation and more, and each is worthy of its own Trend Report article. You can find an ever-growing list at <a href="https://www.futuretools.io/">https://www.futuretools.io/</a>, though it's by no means exhaustive. 
</p><p>While Language Generation and Image Generation seem to be the most prominent, the coming year will be both exciting and terrifying as the machines start to take over. As we look towards the future, it will be interesting to see how these AI-powered tools continue to evolve and how they will be used in various industries. It's exciting to think about all the possibilities, but it's also important to remember to tread carefully and use these tools for the betterment of humanity.</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/1.png" width="785" height="734" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/1.png 600w, https://trends.techlab.works/content/images/size/w785/2023/01/1.png 785w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/2.png" width="802" height="932" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/2.png 600w, https://trends.techlab.works/content/images/size/w802/2023/01/2.png 802w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/3.png" width="822" height="532" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/3.png 600w, https://trends.techlab.works/content/images/size/w822/2023/01/3.png 822w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/MicrosoftTeams-image--11-.png" width="810" height="1632" alt="Nothing is happening in AI and ML right now..." 
srcset="https://trends.techlab.works/content/images/size/w600/2023/01/MicrosoftTeams-image--11-.png 600w, https://trends.techlab.works/content/images/size/w810/2023/01/MicrosoftTeams-image--11-.png 810w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/MicrosoftTeams-image--12-.png" width="808" height="1687" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/MicrosoftTeams-image--12-.png 600w, https://trends.techlab.works/content/images/size/w808/2023/01/MicrosoftTeams-image--12-.png 808w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-161031.png" width="805" height="524" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/Screenshot-2023-01-12-161031.png 600w, https://trends.techlab.works/content/images/size/w805/2023/01/Screenshot-2023-01-12-161031.png 805w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-161241.png" width="815" height="1362" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/Screenshot-2023-01-12-161241.png 600w, https://trends.techlab.works/content/images/size/w815/2023/01/Screenshot-2023-01-12-161241.png 815w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-161537.png" width="803" height="1277" alt="Nothing is happening in AI and ML right now..." 
srcset="https://trends.techlab.works/content/images/size/w600/2023/01/Screenshot-2023-01-12-161537.png 600w, https://trends.techlab.works/content/images/size/w803/2023/01/Screenshot-2023-01-12-161537.png 803w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2023/01/Screenshot-2023-01-12-162014.png" width="811" height="1213" alt="Nothing is happening in AI and ML right now..." srcset="https://trends.techlab.works/content/images/size/w600/2023/01/Screenshot-2023-01-12-162014.png 600w, https://trends.techlab.works/content/images/size/w811/2023/01/Screenshot-2023-01-12-162014.png 811w"></div></div></div></figure>]]></content:encoded></item><item><title><![CDATA[Web Inclusivity: Considering WCAG Compliance in our applications.]]></title><description><![CDATA[<p>When I started my degree in Design Computing in 2017, I never really learned or talked about software accessibility or inclusive design. I only focused on the aesthetic aspects of UX/UI design, which I thought would earn me the most marks. Now working</p>]]></description><link>https://trends.techlab.works/web-inclusivity-considering-wcag-compliance-in-our-applications/</link><guid isPermaLink="false">62a041a850ae570001b399d0</guid><dc:creator><![CDATA[Darren Phung]]></dc:creator><pubDate>Thu, 09 Jun 2022 06:30:32 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2022/06/6583-min-1.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2022/06/6583-min-1.jpg" alt="Web Inclusivity: Considering WCAG Compliance in our applications."><p>When I started my degree in Design Computing in 2017, I never really learned or talked about software accessibility or inclusive design. I only focused on the aesthetic aspects of UX/UI design, which I thought would earn me the most marks. 
Now, after working professionally at TechLab for 2 years and constantly deploying applications to broader audiences, I keep inclusive design at the forefront of my design process, specifically methods for bringing WCAG compliance to our applications. I would like to share some of the processes and tools that go towards making applications WCAG compliant.</p><hr><p><u><strong>What is WCAG?</strong></u></p><p><a href="https://www.w3.org/WAI/standards-guidelines/wcag/">WCAG (Web Content Accessibility Guidelines)</a> are defined by the W3C (World Wide Web Consortium), an international community that develops online standards. This set of guidelines is meant to enhance mobile and web functionality to be friendlier and give end-users greater flexibility. Due to the dynamic nature of mobile and web technologies, the WCAG guidelines are regularly updated to meet new standards. WCAG 2.0 is used mostly for web accessibility, but the latest version, 2.1, adds support for accessibility on mobile devices.</p><p></p><p><u><strong>Requirements for making a site WCAG-compliant</strong></u></p><p>To ensure that your application meets the WCAG standard, it must follow the four POUR design principles:</p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2022/06/image-2.png" class="kg-image" alt="Web Inclusivity: Considering WCAG Compliance in our applications." srcset="https://trends.techlab.works/content/images/size/w600/2022/06/image-2.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/06/image-2.png 1000w, https://trends.techlab.works/content/images/size/w1081/2022/06/image-2.png 1081w"></figure><ul><li><strong>Perceivable:</strong> Information and user interface components (interactive links, text boxes, buttons, and so on) must be presented in a way that all users can perceive. 
If the application's useful content is completely imperceptible to any of its users, it fails the perceivability test.</li><li><strong>Operable:</strong> User interface components and navigation must be operable.</li><li><strong>Understandable:</strong> Information and the operation of the user interface must be understandable. If a user cannot grasp how a website works or what its information means, it fails the understandability test.</li><li><strong>Robust:</strong> Content must be robust enough that it can be interpreted reliably by a wide variety of user agents, for example, not only standard web browsers but also third-party assistive technologies such as screen readers.</li></ul><p></p><p><u><strong>How do I make applications WCAG-compliant?</strong></u></p><p>WCAG fortunately breaks down the POUR design principles into several lower-level guidelines for specific topics and further dissects each guideline into a set of success criteria, which can act as a checklist for compliance.</p><p>The success criteria are also classified under three levels: A, AA, and AAA, with <strong>A</strong> providing the most basic level of accessibility and <strong>AAA</strong> the most comprehensive. In most cases, I aim to meet WCAG compliance up to <strong>AA</strong>. Though it’s beyond the scope of this article to list the guidelines and criteria in full, I use these checklist guides to make sure I am following all WCAG requirements.</p><ul><li><a href="https://www.w3.org/WAI/WCAG21/quickref/">“How to Meet WCAG” quick reference guide</a></li><li><a href="https://webaim.org/standards/wcag/checklist">WebAim's WCAG 2 Checklist</a></li></ul><p></p><p><u><strong>Tools for testing accessibility compliance</strong></u></p><p><em>Accessibility tools for checking color contrast</em></p><ol><li><a href="https://webaim.org/resources/contrastchecker/">https://webaim.org/resources/contrastchecker/</a> is the most basic tool for checking color contrast. 
I use this tool the most, as it guides my use of color early in the design process when checking things such as buttons, banners, and images.</li></ol><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2022/06/image-1.png" class="kg-image" alt="Web Inclusivity: Considering WCAG Compliance in our applications." srcset="https://trends.techlab.works/content/images/size/w600/2022/06/image-1.png 600w, https://trends.techlab.works/content/images/size/w973/2022/06/image-1.png 973w"></figure><p><em>Accessibility tools for checking the code</em></p><ol><li><a href="https://wave.webaim.org/extension/">WAVE Browser extension</a> for Chrome and Firefox will check a range of features, including color contrast, the use of alt text, Accessible Rich Internet Applications (ARIA) labels, form features, Document Object Model (DOM) structure, missing titles, duplicate content and much more. The tool also provides tips on how to fix the errors it finds, along with references to accessibility guidelines. We find it very useful for quickly checking heading hierarchy in the Outline view.</li><li><a href="https://achecker.ca/">AChecker</a> is an online tool for testing live websites, which also provides results grouped by guideline.</li><li><a href="https://getpericles.com/">Pericles</a> is an online screen reader tool. I use this to check the order in which text is read.</li></ol><p><em>Accessibility tools for testing keyboard navigation</em></p><ol><li><a href="https://www.mozilla.org/en-US/firefox/">Mozilla Firefox</a> is a great tool for checking whether a website can be used with a keyboard. To enable keyboard navigation, locate and enable the “Always use the cursor keys to navigate within pages” setting in your Settings or Preferences panel.</li></ol><p></p><p><u><strong>Designing for Dignity</strong></u></p><p>Designing for dignity is about ensuring people of all abilities can access an application in a way that is equitable for all. 
It is about designing a workflow with the expectation that people can access the application like everyone else, regardless of <strong>mental or physical</strong> disability. However, it is sometimes impossible to entirely gauge a user’s lived experiences. In a recent TechLab project, <a href="https://www.myrecoveryjournal.techlab.works/">PersonalRecoveryPlan</a>, we had to consider the several factors that can trigger a mental-health crisis in an at-risk user. This means the application has to take measures to support that user if adversity strikes. These measures include designing coherent buttons and workflows for users to access external and internal resources that may assist them during a crisis.</p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2022/06/Frame-2.png" class="kg-image" alt="Web Inclusivity: Considering WCAG Compliance in our applications." srcset="https://trends.techlab.works/content/images/size/w600/2022/06/Frame-2.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/06/Frame-2.png 1000w, https://trends.techlab.works/content/images/size/w1600/2022/06/Frame-2.png 1600w, https://trends.techlab.works/content/images/size/w1920/2022/06/Frame-2.png 1920w"></figure><p></p><p><u><strong>Why does it matter?</strong></u></p><p>Dignified access for all users allows them to access and navigate an application/website on the same basis as their colleagues and other users, to the point where they can do it independently. Taking these comprehensive measures represents the crucial ethos of building a better and more inclusive web. 
This is especially pertinent as <a href="https://www.who.int/teams/noncommunicable-diseases/sensory-functions-disability-and-rehabilitation/world-report-on-disability">15% of the world’s population has some sort of disability</a> and <a href="https://www.who.int/news-room/fact-sheets/detail/mental-disorders">1 in 8 people currently live with a mental or neurological disorder</a> (WHO).</p><p>The University of Sydney also has a large presence in improving access and inclusion for people with disabilities. This is set out in the <a href="https://www.sydney.edu.au/content/dam/corporate/documents/about-us/values-and-visions/disability-inclusion-action-plan-2019-24/disability-inclusion-action-plan-2019-24.pdf">Disability Inclusion Action Plan</a>, which is reviewed and renewed every 6 years. The Disability Inclusion Action Plan is a strategy for changing practices which may result in discrimination against people with disability. It identifies and removes barriers for people with disability to fully participate at the University, including its learning, teaching, physical, digital, living and communication environments. 
Digitally, the plan saw the roll-out of accessible communications technology across the University, including the installation of accessibility software across student computing spaces, and all University of Sydney web pages being progressively updated to meet WCAG standards.</p><hr><p>As an innovative team and environment at TechLab, we not only pave the way in emerging technologies and processes, but also uphold the University of Sydney's values of inclusivity, dignity and accessibility in all our products.</p>]]></content:encoded></item><item><title><![CDATA[Virtual Tours, a pandemic solution with wider implications]]></title><description><![CDATA[<p></p><p>Virtual tours have long been used in real estate as interactive, informative approaches to immersing customers in a space, primarily for sales and marketing. However, broader use cases in education and training continue to come to the fore.</p><p>Providing site-specific learning experiences at scale, both synchronously and asynchronously,</p>]]></description><link>https://trends.techlab.works/virtual-tours-a-pandemic-solution-with-wider-implications/</link><guid isPermaLink="false">6218803a1e483b0001c2f52b</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Wed, 16 Mar 2022 03:37:39 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2022/03/panorama_96BFB618_854C_03E9_41DA_FCAED009975C_hd_t.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2022/03/panorama_96BFB618_854C_03E9_41DA_FCAED009975C_hd_t.jpg" alt="Virtual Tours, a pandemic solution with wider implications"><p></p><p>Virtual tours have long been used in real estate as interactive, informative approaches to immersing customers in a space, primarily for sales and marketing. 
However, broader use cases in education and training continue to come to the fore.</p><p>Providing site-specific learning experiences at scale, both synchronously and asynchronously, while measuring the engagement of learners, is among the most difficult challenges the pandemic has imposed on educators. With the digital tours described here, we can deliver rich and pedagogically sound experiences, expanding the reach of tertiary learning beyond the cohort of students able to attend in person. These experiences are accessible from a browser and compatible with VR headsets, assuring a baseline equitable experience for all students.</p><p>It starts with photos and videos - the technology hasn't shifted massively in 2020-22, so as a starting point for some of the equipment employed here, you will want to reference <strong><a href="https://trends.techlab.works/what-weve-learned-from-7-years-of-360-video/">What we've learned from 7 years of 360 video</a></strong>. Since that article was published, the Insta360 ONE R has formed a solid foundation for our mixed reality and 360 platforms.</p><h3 id="virtual-site-visits">Virtual Site Visits</h3><p>EPIC lab is a research laboratory located at the Children's Hospital at Westmead, using 3D tech to improve quality of care around the world. As part of the Applied Medicine Major, students previously attended this site to learn about prototyping, product design, and regulatory compliance for medical devices. Hospital site visits were heavily restricted under the strict procedures introduced in response to the SARS-CoV-2 pandemic. <a href="https://epic.techlab.works"><strong>https://epic.techlab.works</strong></a> was developed in a few days to allow students to visit the laboratory virtually and see some of the technology deployed there. 
Combined with online discussion and tutorial participation via Zoom, this supplemented the lack of physical access.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/Screenshot-2022-03-09-142314.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/Screenshot-2022-03-09-142314.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/Screenshot-2022-03-09-142314.png 1000w, https://trends.techlab.works/content/images/size/w1074/2022/03/Screenshot-2022-03-09-142314.png 1074w"><figcaption>One view of the EPIC lab in 360-degree VR</figcaption></figure><h3 id="tutorial-lab-inductions">Tutorial Lab inductions</h3><p>Since travel was restricted in early 2020, many students that semester (and the 5 semesters since) were unable to travel to the Sydney campus; this meant that in classes with a heavily tutorial-focused curriculum, students would be undertaking these studies virtually. While some experiments and tutorial activities for electrical engineering students could use online toolkits, the actual environments of the labs are a core part of the study. Our solution was to create a similar virtual tour for each of the labs at <strong><a href="https://eielabs.techlab.works/">https://eielabs.techlab.works/</a></strong>, highlighting the equipment whose online simulated versions students would be using. 
We also included the floor plan of the J03 electrical engineering building for context, to connect the students to campus, and to induct them on points of safety.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/Screenshot-2022-03-09-142223.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/Screenshot-2022-03-09-142223.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/Screenshot-2022-03-09-142223.png 1000w, https://trends.techlab.works/content/images/size/w1457/2022/03/Screenshot-2022-03-09-142223.png 1457w"><figcaption>An Electrical Engineering Lab in 360</figcaption></figure><h3 id="urban-ecology-field-trips-online">Urban Ecology field trips online</h3><p>Walking around for many hours to investigate places where people congregate was simply not viable during the pandemic. In urban ecology, both undergraduate and postgraduate units involved groups of 150-400 students per semester traversing 12-15 km, walking in groups across different zones on campus. This entailed note-taking and producing reports (on buildings, sustainability and green areas) from these site visits for tutorial activities and assessments. While the pandemic was a blocker for large groups gathering on campus, there was already a pre-existing inequity issue preventing students with disabilities or special needs from participating with the same level of rigour as their peers.

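</p><p>Under the hood, each of these tours is essentially a graph: a set of 360 scenes connected by clickable navigation hotspots. As a minimal sketch (with illustrative scene names and data shapes, not our actual implementation), here is a plain JavaScript check that every scene in such a graph stays reachable from the entry point:</p><!--kg-card-begin: markdown-->

```javascript
// A virtual tour modelled as a graph: each 360 scene lists the
// hotspots that link to other scenes. Scene names are illustrative.
const tour = {
  entrance: { image: "pano/entrance.jpg", hotspots: ["benchArea", "storeRoom"] },
  benchArea: { image: "pano/bench.jpg", hotspots: ["entrance", "storeRoom"] },
  storeRoom: { image: "pano/store.jpg", hotspots: ["entrance"] },
};

// Breadth-first walk from the starting panorama: which scenes can a
// visitor actually reach? Handy as a sanity check that editing a
// tour hasn't orphaned a scene.
function reachableScenes(scenes, start) {
  const seen = new Set([start]);
  const queue = [start];
  while (queue.length > 0) {
    const current = queue.shift();
    for (const next of scenes[current].hotspots) {
      if (!seen.has(next)) {
        seen.add(next);
        queue.push(next);
      }
    }
  }
  return [...seen];
}
```

<!--kg-card-end: markdown--><p>A check like this, run whenever a tour is edited, catches dead ends before students do.</p><p>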
</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/Screenshot-2022-03-09-142715.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/Screenshot-2022-03-09-142715.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/Screenshot-2022-03-09-142715.png 1000w, https://trends.techlab.works/content/images/size/w1057/2022/03/Screenshot-2022-03-09-142715.png 1057w"><figcaption>The landing page for the tour of BADP2005 (City Design and Urban Ecology)</figcaption></figure><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/Screenshot-2022-03-09-142902.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/Screenshot-2022-03-09-142902.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/Screenshot-2022-03-09-142902.png 1000w, https://trends.techlab.works/content/images/size/w1034/2022/03/Screenshot-2022-03-09-142902.png 1034w"><figcaption>A GPS-tracked video tour of Sydney green spaces for ARCH9080 (Urban Ecology, Design and Planning)</figcaption></figure><h3 id="museums-and-object-based-learning">Museums and object-based learning</h3><p>The Chau Chak Wing Museum opened in 2020, resplendent with nearly half a million collection items. Many academics had begun the work of integrating <a href="https://mgnsw.org.au/sector/resources/online-resources/education/object-based-learning-school-groups-museums/">object-based learning</a> into the curriculum; museum interactions were slated to be a key part of multi-faculty graduate experiences. 
The dedicated learning spaces were developed to allow students to interact with and observe ancient artefacts, ethnographic exhibits, science and historic photography, and contemporary artworks.</p><p>70% of the items on display in the CCWM were previously locked away in storage for 20+ years, and a sizeable resource investment went into having them ready for learning and research - the challenge and opportunity was now to have the <em>online environment complement the built environment</em>. To have our students engage with the museum, we co-developed a tour of some of the more interesting artefacts with students, tying them to their studies. We worked to produce multiple versions for Arts, Science and Business, plus a communal version the museum could use in its communications and outreach. E.g., <a href="https://usydcampus.techlab.works/">https://usydcampus.techlab.works/</a> includes prompts for students to engage and undertake assessment.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/Screenshot-2022-03-09-142412.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/Screenshot-2022-03-09-142412.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/Screenshot-2022-03-09-142412.png 1000w, https://trends.techlab.works/content/images/size/w1360/2022/03/Screenshot-2022-03-09-142412.png 1360w"><figcaption>The floor plan of the CCWM, delivered via the web browser with interactive 360 exhibits.</figcaption></figure><h3 id="extending-into-training">Extending into training</h3><p>With high-fidelity 360 video comes the opportunity to implement training and compliance. 
Ranging from <a href="https://food3001.techlab.works/"><strong>Virtual Food Science labs</strong></a>, Dental Radiography and <a href="https://cadigalgreen.techlab.works/"><strong>Cadigal Green land</strong></a> learning, to 'Farm to Table' supply chain, we have created immersive experiences to train students and even industry partners, embedding them in spaces that would otherwise be difficult or impossible to access. E.g., the University has only four Dental Radiography suites for use by several hundred students; using VR, however, we can give students direct virtual access to the spaces alongside quizzes and tests for self-reflection.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/image.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/image.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/image.png 1000w, https://trends.techlab.works/content/images/size/w1306/2022/03/image.png 1306w"><figcaption>Demonstration of the Deadman Switch for Periapical Radiography at the Sydney Dental School, Westmead</figcaption></figure><h3 id="connectedness-to-campus">Connectedness to campus</h3><p><a href="https://www.sydney.edu.au/courses/units-of-study/2022/fass/fass1000.html">FASS1000</a> - a foundational unit for all commencing students of the Faculty of Arts and Social Sciences - was originally conceived around instructor-led walkthroughs of different parts of campus, covering aspects like space, place, history, activism, culture and community, to build a sense of connectedness to the University of Sydney. 
Not only did we <a href="https://oursydney.techlab.works/">virtualise this experience, but we also equipped students with the ability to create and share their own journey-map</a> - the unit of study won the 2021 <a href="https://www.sydney.edu.au/about-us/our-story/vice-chancellor-awards-2021.html">Vice-Chancellor's award</a> for innovation and excellence in teaching.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2022/03/image-8.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/image-8.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/image-8.png 1000w, https://trends.techlab.works/content/images/size/w1223/2022/03/image-8.png 1223w"><figcaption>"Our Sydney" platform highlighting select points of significance on campus with embedded multimedia</figcaption></figure><h3 id="evolution-of-teaching-platforms-in-3d">Evolution of teaching platforms in 3D</h3><p>All the previous examples share common barriers - students and educators cannot create them without significant technical training (e.g., React- and JavaScript-based tours, complex 3D modelling tool sets), or without significant licence costs, as in the Museum and Radiography examples. 3D Vista software can render 360 experiences, but still requires a sysadmin to deploy it to AWS or similar infrastructure. 
Additionally, while we might password-protect web services, or implement a web access gateway so students and staff use their university credentials for access, we can't measure users' granular engagement for auditing or assessment.</p><p>To that end, we have been increasing the utility of our in-house VR asset management tool set: <strong><a href="https://etakidev.techlab.works/">Eta-Ki</a>.</strong> </p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2022/03/image-7.png" class="kg-image" alt="Virtual Tours, a pandemic solution with wider implications" srcset="https://trends.techlab.works/content/images/size/w600/2022/03/image-7.png 600w, https://trends.techlab.works/content/images/size/w1000/2022/03/image-7.png 1000w, https://trends.techlab.works/content/images/size/w1305/2022/03/image-7.png 1305w"></figure><p>Eta-Ki now supports 360 image/video annotation, creating collections and linking to associated assets, allowing a "virtual tour" to occur entirely inside the application. This provides granular navigation and interaction metrics linked to each user, plus the ability to protect intellectual property behind the university identity system. Eta-Ki also supports gigapixel imagery and 25 formats of 3D models, which otherwise pose cross-platform compatibility challenges. Now extended to generate gigapixel imagery from microscopy slides, it is built entirely 'serverless' and is already in use in 4 units of study, with 6 more planned to go live soon. 
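</p><p>To illustrate what granular engagement measurement can look like, here is a small JavaScript sketch that rolls per-scene dwell events up into a per-user summary. The event shape is hypothetical and not Eta-Ki's actual schema:</p><!--kg-card-begin: markdown-->

```javascript
// Hypothetical per-user engagement events of the kind a tour
// platform can record once users sign in with university credentials.
const events = [
  { user: "u1", scene: "entrance", dwellSeconds: 40 },
  { user: "u1", scene: "benchArea", dwellSeconds: 95 },
  { user: "u2", scene: "entrance", dwellSeconds: 12 },
];

// Roll the raw events up into one summary per user: distinct scenes
// visited plus total time spent in the tour.
function summariseEngagement(log) {
  const byUser = new Map();
  for (const { user, scene, dwellSeconds } of log) {
    const entry = byUser.get(user) ?? { scenes: new Set(), totalSeconds: 0 };
    entry.scenes.add(scene);
    entry.totalSeconds += dwellSeconds;
    byUser.set(user, entry);
  }
  return byUser;
}
```

<!--kg-card-end: markdown--><p>A summary of this shape is enough to audit whether a student opened a tour, which scenes they visited, and how long they spent overall.</p><p>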
</p><p>While bespoke solutions and off-the-shelf tools like 3D Vista help in the short term, they are not sustainable for delivering comprehensive XR interactions. As Eta-Ki grows, we hope to have that solution in place, enabling students, educators, academics and community members to experience the University of Sydney virtually.</p>]]></content:encoded></item><item><title><![CDATA[Digital Fluency: a graduate quality]]></title><description><![CDATA[While digital literacy is important, the leaders of tomorrow will have digital fluency, the ability to navigate and utilise software for creativity, storytelling and problem-solving, and it will continue to be a powerful tool for the highest achievers. ]]></description><link>https://trends.techlab.works/digital-fluency-a-graduate-quality/</link><guid isPermaLink="false">61972e18b8877b0001fc9852</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Thu, 25 Nov 2021 04:56:53 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-120732-2.png" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-120732-2.png" alt="Digital Fluency: a graduate quality"><p><a href="https://unity.com/">Unity</a> is a game development engine that is not only leveraged by AAA game studios, but increasingly by other industries as a communication and interaction platform. 
Thanks to the <a href="https://unity.com/products/unity-academic-alliance">Unity Academic Alliance</a>, we took the opportunity to integrate it into learning and teaching and trial it as a pedagogical tool, empowering the Medical Science curriculum through creativity and digital fluency.</p><p>While digital literacy is important, the leaders of tomorrow will require digital fluency – the ability to navigate and utilise software for creativity, storytelling and problem-solving – and to set the foundation for continued learning outside of their academic degrees. </p><p>For the past 2 years, we have been evolving a game development curriculum inside medical sciences. There are 3 reasons we have undertaken this journey:</p><ol><li><em>Graduate outcomes</em>:<br>The University of Sydney's Education Strategy propounds a list of graduate qualities, among which are inventiveness, critical thinking and problem-solving, information and digital literacy, and interdisciplinary effectiveness. We believe that game design and development greatly supports these capabilities.</li><li><em>The memory palace technique</em>:<br>Modern research shows that when high-performing mental athletes, the sort of individuals who engage in trivia competitions and game shows as a career, learn new information, they engage regions of the brain known to be involved in two specific tasks: visual memory and spatial navigation.<br>They deliberately and purposefully encode the data with a visual representation and location within a kind of memory palace.<br>We believe that visually structuring learning and making it interactive makes it 'sticky' in the minds of learners.</li><li><em>Learning should be interesting, immersive and challenging</em> <br>The COVID pandemic has reinforced the fact that the 120-minute lecture is not conducive to engaged learning. 
Research on lecture-watching statistics worldwide continues to trend towards shorter, more involved and impactful chunks of information, with participation and co-creation as the true drivers of knowledge retention. We believe game design and development to be one of the most participatory learning experiences a student can undertake.</li></ol><p><strong>The Challenge</strong>:<br>Medical science students are not game developers; most have never heard of Unity, and their experience even with software development is limited.</p><p>We scaffolded the game capability across 3 years, in the first instance giving students heavily templated projects to work on to increase familiarity with the tools. </p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2021/11/image-5.png" class="kg-image" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/image-5.png 600w, https://trends.techlab.works/content/images/size/w800/2021/11/image-5.png 800w"><figcaption>The prompt for the MEDS1001 Assessment</figcaption></figure><p>At first, we simply provided a template space for students to populate. This was loosely based on the shape of our existing Chau Chak Wing Museum, and would be recognisable to them. This was moderately effective, but introduced constraints and barriers. With a very diverse and talented cohort, the learning impact was varied, as it set a level of expectation but also made it difficult for students who wanted to flex their creativity. 
With that in mind, we began a capstone program with our Computer Science students to simplify the flow from web-based architectural design to Unity-based interactive experience development.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2021/11/image-6.png" class="kg-image" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/image-6.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/image-6.png 1000w, https://trends.techlab.works/content/images/size/w1600/2021/11/image-6.png 1600w, https://trends.techlab.works/content/images/size/w2281/2021/11/image-6.png 2281w"><figcaption>The MedsUnited tool for generating Unity Scenes in a web browser.</figcaption></figure><p>Currently, we have limited exposure to Unity in the second year of the course, but we are working towards making it holistic. Throughout their second year, medical science students are exposed to a lot of complex information around anatomy, histology, pathology and a myriad of other challenging subjects. We have begun the work of creating 3D assets for many of these subject areas and storing them in our virtual asset management system, Eta-Ki. While not limited to the medical sciences, our system caters to a massive number of file formats and allows for sharing and embedding in the University LMS.</p><p>Third year is when the really challenging work begins: the capstone experience. This year, we really dialled in the approach for having students develop interactive Unity experiences in small teams. The outcomes speak for themselves in terms of quality, but more so the student experience described in their own reflections and reports paints a picture of an engaged learning cohort that understands the cross-disciplinary application of the skills they have developed.</p><p>Furthermore, we have qualitative feedback indicating the value of this approach. 
Students enter the workforce with these skills contributing to their employability, and with an understanding that all jobs are technology jobs, from clinical roles through to construction and facilities services.</p><p>These projects were delivered in a wide range of modalities: some projects were web-based, some were installable games, and still others were mobile applications designed to be used on the phone. As someone who has spent a lot of time making "apps", the speed at which real, high-quality products were delivered by non-coders continues to impress me.</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-120732-1.png" width="1115" height="579" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-120732-1.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-120732-1.png 1000w, https://trends.techlab.works/content/images/size/w1115/2021/11/Screenshot-2021-11-23-120732-1.png 1115w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-120931.png" width="1208" height="768" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-120931.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-120931.png 1000w, https://trends.techlab.works/content/images/size/w1208/2021/11/Screenshot-2021-11-23-120931.png 1208w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-121047-1.png" width="1352" height="1250" alt="Digital Fluency: a graduate quality" 
srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-121047-1.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-121047-1.png 1000w, https://trends.techlab.works/content/images/size/w1352/2021/11/Screenshot-2021-11-23-121047-1.png 1352w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-121151.png" width="1301" height="950" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-121151.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-121151.png 1000w, https://trends.techlab.works/content/images/size/w1301/2021/11/Screenshot-2021-11-23-121151.png 1301w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-121239-1.png" width="2170" height="1067" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-121239-1.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-121239-1.png 1000w, https://trends.techlab.works/content/images/size/w1600/2021/11/Screenshot-2021-11-23-121239-1.png 1600w, https://trends.techlab.works/content/images/size/w2170/2021/11/Screenshot-2021-11-23-121239-1.png 2170w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-121356.png" width="1248" height="689" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-121356.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-121356.png 1000w, 
https://trends.techlab.works/content/images/size/w1248/2021/11/Screenshot-2021-11-23-121356.png 1248w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-121657-1.png" width="1811" height="1228" alt="Digital Fluency: a graduate quality" srcset="https://trends.techlab.works/content/images/size/w600/2021/11/Screenshot-2021-11-23-121657-1.png 600w, https://trends.techlab.works/content/images/size/w1000/2021/11/Screenshot-2021-11-23-121657-1.png 1000w, https://trends.techlab.works/content/images/size/w1600/2021/11/Screenshot-2021-11-23-121657-1.png 1600w, https://trends.techlab.works/content/images/size/w1811/2021/11/Screenshot-2021-11-23-121657-1.png 1811w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2021/11/Screenshot-2021-11-23-123807-1.png" width="475" height="284" alt="Digital Fluency: a graduate quality"></div></div></div><figcaption>A selection of screenshots from our 2021 projects.</figcaption></figure><p>Moving forward we plan to expand this offering into other disciplines, as we understand that a driven and capable student cohort, will help create leaders in a range of verticals in the future, which will allow them to attack difficult and wicked problems from a truly cross disciplinary angle.</p><p>We also plan to continue to iterate the MedsUnited tool to better service the first year cohort. While the ability to code is important in the greater context of digital fluency, it's also important to give the students a sense of accomplishment, balancing this is where we will be focusing in the future. It feels very good to have created a real-world experience, just ask our students.<br></p><!--kg-card-begin: markdown--><blockquote>
<p>&quot;We submitted our project a few days ago, and we're all so happy with what we were able to produce.&quot;</p>
</blockquote>
<!--kg-card-end: markdown--><p></p><!--kg-card-begin: markdown--><blockquote>
<p>Just something I would like to also share with the teaching staff of MEDS1001, I had an interview for a full-time position at a medical company and one of the skill set they asked for was Unity. During the interview, I was able to show the interviewers our virtual gallery, and they were quite excited to see the virtual gallery and asked a few questions about the possibility of implementing VR. Tomorrow, I will be visiting their company to meet their engineers and attend their R&amp;D meeting. Once again, thank you MEDS1001 teaching staff for preparing students for the graduate qualities and state of the art programming skills that the employers are looking for.</p>
</blockquote>
<!--kg-card-end: markdown--><p>We are seeking permission to share these games with all of you; watch this space.</p>]]></content:encoded></item><item><title><![CDATA[Experimentation applied into COVID-responsive solutions]]></title><description><![CDATA[<h3 id="select-initiatives-q1-2020-to-q3-2021">Select initiatives Q1, 2020 to Q3, 2021</h3><p>While the TechLab team largely focuses on experimentation in the emerging technology space, COVID created a sense of urgency to build digital and/or virtual experiences that replaced or supplemented in-person experiences in areas of learning, research and even student onboarding and support/allied services.</p>]]></description><link>https://trends.techlab.works/techlab-initiatives-2020-21/</link><guid isPermaLink="false">60e2c6c3f107c50001ef58ab</guid><dc:creator><![CDATA[TechLab]]></dc:creator><pubDate>Tue, 06 Jul 2021 01:13:36 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2022/03/wmbb-site.jpg" medium="image"/><content:encoded><![CDATA[<h3 id="select-initiatives-q1-2020-to-q3-2021">Select initiatives Q1, 2020 to Q3, 2021</h3><img src="https://trends.techlab.works/content/images/2022/03/wmbb-site.jpg" alt="Experimentation applied into COVID-responsive solutions"><p>While the TechLab team largely focuses on experimentation in the emerging technology space, COVID created a sense of urgency to build digital and/or virtual experiences that replaced or supplemented in-person experiences in areas of learning, research and even student onboarding and support/allied services. </p><p>Not only did these need to be tactical (quick fixes), but they also needed to be adequately robust to be scalable, with potential for continued use without significant investment in resources with technical know-how to support and maintain them.</p><p>For simpler quick builds, we started by applying our previously learned approaches, taking 360 images and videos and converting them into interactive virtual experiences. 
A key focus was also for the <a href="https://trends.techlab.works/extended-reality/"><strong>AR, VR, MR, XR</strong></a> solutions to be accessible from any modern browser, to maintain equity between remote and onsite students.</p><p>In other cases, the builds ranged across complex 3D modelling of machinery for remote responsive use, multilingual platforms, Internet of Things (IoT)-integrated tools for contactless health checks and more. </p><!--kg-card-begin: html--><table>
<tr>
<th>Platform / tool</th>
<th>What is it?</th>
<th>Who uses it?</th>
 <th>Link</th> 
</tr>

  <tr>
    <td style="padding: 10px; width: 150px; text-align: left;">vSydney</td>
    <td style="text-align: left;">Walkthrough of clubs and societies for student peer connection opportunities - this is a single source of truth for the list of USU clubs, their contact and registration info, and social media links. It also has a machine learning-driven "smart search" for info</td>
    <td style="width: 150px; text-align:left;">Any student that USU markets this information to, usage peaks at start of semester</td>
      <td style="width: 135px;"> <a href="https://vsydney.techlab.works/"> vSydney - Clubs and Societies </a> </td>  
  </tr>
    
      <tr>
        <td style="padding:10px; text-align: left;">Engineering labs 360-tour</td>
    <td style="text-align: left;">360-degree tour with annotations of levels 1-6 of Electrical Engineering facilities, rooms and labs in building J03 / PNR building</td>
    <td style="text-align:left;">Any engineering student using J03 facilities and labs - over 3500 users/semester</td>
          <td><a href="https://eielabs.techlab.works/"> Engineering labs virtual tour </a> </td>  
  </tr>

      <tr>
    <td style="padding:10px; text-align: left;">OurSydney</td>
    <td style="text-align: left;">Virtual walkthrough of select highlight areas on our campus, covering aspects like space, place, history, activism, culture and community for building a sense of connectedness to the University for commencing FASS Students, with ability for students to create and share their own journeymap</td>
    <td style="text-align:left;">FASS1000 students, ~1500 a semester</td>
          <td><a href="https://oursydney.techlab.works/"> OurSydney for FASS1000 </a> </td>  
  </tr>

      <tr>
    <td style="padding:10px; text-align: left;">Wingara Mura-Bunga Barrabugu (WMBB) tour</td>
    <td style="text-align: left;"> Hosting 100-125 Aboriginal and/or Torres Strait Islander students from NSW in Years 11 and 12 each semester, this virtual tour is part of the flagship event for the Wingara Mura Bunga Barrabugu strategy, and gives students an opportunity to see the University of Sydney as their future university. This platform allows WMBB program coordinators to show some key spaces, resources and opportunities on campus to attract future students from indigenous communities and remote locations </td>
    <td style="text-align:left;">100-125 WMBB future students per semester</td>
      <td><a href="https://wmbb.techlab.works/"> WMBB campus experience </a> </td>  
  </tr>

      <tr>
    <td style="padding:10px; text-align: left;">ITAS platform</td>
    <td style="text-align: left;">Indigenous Tutor Assistance Scheme (DVC-ISS portfolio) - managed end-to-end from expression of interest from indigenous students of all faculties, data science driven algorithms to match them with suitable tutors, book and manage sessions, validate sessions, report concerns and a 'smart search' driven FAQ self-service knowledge base</td>
    <td style="text-align:left;">300-400 students, 50-200 tutors per semester across all faculties </td>
      <td><a href="https://itas.techlab.works/"> ITAS portal </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">TechLab info share (Trend Report)</td>
    <td style="text-align: left;">ICT digital innovation team's knowledge-sharing tool, reporting metrics to leadership and any interested members of the University community</td>
    <td style="text-align:left;">TechLab team, ICT and broader stakeholders</td>
      <td><a href="https://trends.techlab.works/"> TechLab Trend Report </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">TechLab Reporting </td>
    <td style="text-align: left;">Feeds into the Trend Report (https://trends.techlab.works/): live data on how TechLab team members spend their time, who they engage with, and the relevant beneficiaries and impact</td>
    <td style="text-align:left;">TechLab team, ICT leadership</td>
      <td><a href="https://reporting.techlab.works/login/?next=/"> TechLab metrics reporting </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">Urban ecology online tour</td>
    <td style="text-align: left;">For Urban Ecology, Design and Planning - Sydney School of Architecture, Design and Planning: a 360-degree media-based tour that runs in any modern browser or VR headset, letting students explore terrains and parts of campus and replacing an entirely in-person, large-group experience involving a 10-15 km walk</td>
    <td style="text-align:left;">150-200 students per semester in ARCH9080 </td>
      <td><a href="https://urban-ecology.techlab.works/"> Urban Ecology Virtual walk </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">Epic lab tour</td>
    <td style="text-align: left;">360-degree images stitched together to allow virtual induction into the Engineering and Prototyping of Implants for Children Laboratory at Westmead</td>
    <td style="text-align:left;">Epic lab visitors (students, staff, affiliates) - variable</td>
      <td><a href="https://epic.techlab.works/"> Epic Lab induction </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">Cadigal Green site</td>
    <td style="text-align: left;">Interactive and immersive experience with learning elements and quizzes about the public domain of our Darlington campus, now named Cadigal Green</td>
    <td style="text-align:left;">Educational designers - Faculty of Science</td>
      <td> <a href="https://cadigalgreen.techlab.works/"> Learning site -USYD Cadigal Green land </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">Komprenu</td>
    <td style="text-align: left;">MVP of an identity-verification, online interview and review system incorporating machine learning, data science, OCR and NLP, with sentiment analysis to rate the quality of a review based on basic subject knowledge and communication skill</td>
    <td style="text-align:left;">Currently in trial use for Master of International Business test cases only</td>
      <td><a href="https://komprenu.techlab.works/"> Online interview and review system </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">Mind Your Head</td>
    <td style="text-align: left;">Brain and Mind Centre research initiative for an NSW Government and Australia-China relations grant: a multilingual smart web survey tool to track and provide personalised support resources for students from China</td>
    <td style="text-align:left;">In trial use by Brain and Mind Centre researchers and test cohorts of students</td>
      <td><a href="https://mindyourhead.guide"> BMRC international students' mental health check</a> </td>  
  </tr>

  <tr>
  <td style="padding:10px; text-align: left;">Eta-Ki 3D</td>
    <td style="text-align: left;">Smart asset management system for breadth of multimedia for mixed or extended reality (MR, XR) and immersive content (360 images, 360 videos, 3D models etc.)  </td>
    <td style="text-align:left;">MVP in progress, to commence use by Health Pathology in August 2021</td>
      <td><a href="https://etaki3d.techlab.works"> XR Asset management library </a> </td>  
  </tr>

  <tr>
    <td style="padding:10px; text-align: left;">TechLab issue reporting</td>
    <td style="text-align: left;">Users of TechLab platforms and tools that are beyond the prototype stage report their tech issues or bugs here; reports are integrated with our GitHub and Teams alerts</td>
    <td style="text-align:left;">Any user of TechLab-built platforms (the fewer users, the better!)</td>
      <td><a href="https://issue.techlab.works"> TechLab issue reporting </a> </td>  
  </tr>

</table><!--kg-card-end: html--><p><em>We don't have live links for these, but feel free to ask us about them:</em></p><!--kg-card-begin: html--><table>
<tr>
<th>Solution, build</th>
<th>Info, context</th>
</tr>

  <tr>
    <td>AR Frog - Melbourne Zoo</td>
    <td style="text-align: left;">Augmented Reality project of a Southern Corroboree Frog (listed on the IUCN Red List as Critically Endangered) that a user can project onto any surface and interact with, as part of a Melbourne Zoo initiative to build empathy with parents and children on conservation efforts</td> 
  </tr>
    
      <tr>
        <td>Extended Reality (XR) Training, simulations</td>
    <td style="text-align: left;">From food safety training to emergency response training in factory and bottling environments such as Lion Nathan (e.g., bottle breakage, gas leaks), delivering hard-to-mimic real-life scenarios as immersive experiences</td>
  </tr>

      <tr>
    <td>COVID eGate</td>
    <td style="text-align: left;">Temperature-sensor IoT-integrated web app survey tool enabling contact tracing and COVID symptom checks at Sydney Children's Hospital. Delivered as a proof of concept; TechLab involvement closed and the solution was handed over in December 2020</td>
  </tr>
    
    <tr>
    <td>Research driven sensitivity training </td>
    <td style="text-align: left;">Staff empathy training for better interactions with LGBTIQ staff and compliance with workplace policies, delivered as dynamic, responsive decision trees driven by parameters such as words and body language in interactions; plus Virtual Reality training in de-escalating aggressive behaviour for healthcare workers and youth justice officers</td>
  </tr>
    
    <tr>
    <td>Interactive 3D models of complex machines</td>
    <td style="text-align: left;">Virtual inductions, training and responsive virtual trials for using highly intricate, complex, high-cost or sensitive equipment and machinery in chemistry, ophthalmology and other labs</td>
  </tr>
    
</table><!--kg-card-end: html-->]]></content:encoded></item><item><title><![CDATA[Serious Games and SARS-CoV 2]]></title><description><![CDATA[<p>Playing games is fun - a great way to wind down, but more than that, there is a psychological drive in humans to play challenging games. We want to know that we have mastered a situation, and we enjoy the feeling of progressing and accomplishing goals.<br> This leads to the</p>]]></description><link>https://trends.techlab.works/serious-games-and-sars-cov-2/</link><guid isPermaLink="false">5fc6c5ecc0efee000198412c</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Tue, 01 Dec 2020 22:43:27 GMT</pubDate><media:content url="https://trends.techlab.works/content/images/2020/12/yul_vir_final.jpg" medium="image"/><content:encoded><![CDATA[<img src="https://trends.techlab.works/content/images/2020/12/yul_vir_final.jpg" alt="Serious Games and SARS-CoV 2"><p>Playing games is fun - a great way to wind down, but more than that, there is a psychological drive in humans to play challenging games. We want to know that we have mastered a situation, and we enjoy the feeling of progressing and accomplishing goals.<br> This leads to the unique value proposition of educational games, or the genre of <strong><a href="https://en.wikipedia.org/wiki/Serious_game" rel="noreferrer nofollow noopener">serious gaming</a></strong><a href="https://en.wikipedia.org/wiki/Serious_game" rel="noreferrer nofollow noopener">, </a>as a means to help the broader public come to grips with the world changing impact of Coronavirus. 
This approach has some proven success in climate science, with high-profile games such as <strong><a href="https://store.steampowered.com/app/898890/Endling/" rel="noreferrer nofollow noopener">Endling</a></strong> communicating habitat destruction and extinction in a compelling and powerful way, with gameplay elements that are relatable to gamers.<br> The difference between climate change and the pandemic we find ourselves in is that the former has 40 years of scientific consensus, often communicated effectively from scientists to laymen. SARS-CoV 2 has been known to us only since about December 2019. Game developers do not yet have the same breadth of access and understanding, as the science of this pandemic has not yet been fully culturally assimilated.</p><h2 id="so-let-s-get-medical-scientists-to-design-and-build-these-games-">So, let’s get medical scientists to design and build these games.</h2><p><br>As part of the <a href="https://www.sydney.edu.au/courses/units-of-study/2020/meds/meds3888.html" rel="noreferrer nofollow noopener"><strong>MEDS3888</strong></a> cohort in semester 2, 2020, we recruited 4 game development teams. The MEDS3888 program is designed around interdisciplinary problem-solving, and our proposal was simple: <em>How can you communicate the science of COVID-19 using video games?</em> <br>In 2020, the University of Sydney Medical School joined the Unity Academic Alliance; this gave us the ideal launchpad.
Not only does this alliance give us access to software resources, but also to a huge community of global organisations to work with to deliver educational outcomes.</p><figure class="kg-card kg-embed-card"><iframe width="612" height="344" src="https://www.youtube.com/embed/2aIlRDamNT4?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><p>For the first 2 weeks of the semester, student teams spent time establishing a plan. We set groups up in Unity Teams and took them through our game development process.<br> Each group was taken through the 6-step process laid out by the “Ask a Game Dev” channel on YouTube. Then we (unit coordinators) set up Slack channels for each team, and brought in a design student intern 2 days a week to support debugging and polish for the teams. Throughout the 9-week process, we asked teams to incorporate animation, sound design and world aesthetic every time they worked on their next component.</p><h2 id="how-do-we-get-scientists-to-be-game-developers">How do we get scientists to be game developers?</h2><p><br>Each team spent 2-4 hours per week on more complex programming challenges with our design intern. Engagement on Slack was quick and responsive where possible; however, we tried to rely on open-source and credible online resources to solve the questions. For anyone planning to rapidly build a game having never done so before, that list is here:<br></p><ol><li><a href="https://www.youtube.com/channel/UCYbK_tjZ2OrIZFBvU6CCMiA" rel="noreferrer nofollow noopener">Brackeys</a>: Even though he has left Unity development on YouTube, his tutorials are still what any educator should aspire to.</li><li><a href="https://www.youtube.com/channel/UCG08EqOAXJk_YXPDsAvReSg" rel="noreferrer nofollow noopener">Unity</a>: Second only to Brackeys is, of course, Unity's own channel.
The tutorials here are excellent and usually the most up to date, keeping pace with changes to the platform.</li><li><a href="https://www.youtube.com/channel/UCuHVjteDW9tCb8QqMrtGvwQ" rel="noreferrer nofollow noopener">Thomas Brush</a>: Renowned aesthetic designer in Unity for academia and industry</li><li><a href="https://www.youtube.com/channel/UCHM37DnT_QGJT5Zyl4EmqcA" rel="noreferrer nofollow noopener">Dilmer Valecillos</a>: Adept at the technical elements of deployments and making cross-platform tools</li><li><a href="https://www.youtube.com/channel/UCX_b3NNQN5bzExm-22-NVVg" rel="noreferrer nofollow noopener">Jason Weimann</a>: Dedicated to online tutorials helping game development</li></ol><p><br>Where content didn't exist, we created small video-based tutorials shared with every group, to be equitable. There weren’t many places where this was necessary, as the Unity community is large and generous. Lastly, we encouraged groups to leverage pre-built assets as much as they could, as this was merely a 9-week development program alongside full-time study.<br></p><h2 id="enough-already-show-me-the-games-">Enough already, show me the games!</h2><p><br>Gladly.<br><strong><a href="https://medgames2020.techlab.works/" rel="noreferrer nofollow noopener">Three of these games are playable online</a></strong>, while one requires a download (please reach out to <a href="mailto:techlab@sydney.edu.au" rel="noreferrer nofollow noopener">TechLab</a> if you’d like to try ‘Mask On’).</p><figure class="kg-card kg-image-card"><img src="https://medgames2020.techlab.works/img/covoider.png" class="kg-image" alt="Serious Games and SARS-CoV 2"></figure><p><strong>Covoider</strong> by <strong>Team Viral Gaming Dojo</strong> is a swipe-based endless runner where you social distance and collect PPE masks to protect yourself from the virus.
The message of the game is that a combination of approaches is necessary, as risk multiplies exponentially the more time you spend around other people.</p><figure class="kg-card kg-image-card"><img src="https://medgames2020.techlab.works/img/uwuniverse.png" class="kg-image" alt="Serious Games and SARS-CoV 2"></figure><p><strong>UwUniverse: a message of hope</strong> by <strong>Team UwU_UwU</strong> is a first-person shooter where you sanitise the virus and deliver masks to the infected so they can undertake simple tasks like collecting groceries or visiting the gym. It also addresses feelings of isolation for the elderly, as the coronavirus ravages aged care.</p><figure class="kg-card kg-image-card"><img src="https://medgames2020.techlab.works/img/infected.png" class="kg-image" alt="Serious Games and SARS-CoV 2"></figure><p><strong>Infected</strong> by <strong>Team Crème de la Cookie</strong> is an immunology platformer where you travel inside the respiratory system of a SARS-CoV-2 sufferer to help them combat the virus by collecting the spike proteins necessary to develop antibodies.</p><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2020/12/sd15projectgroup3_148038_13625498_Final-capstone-project-report_.jpg" class="kg-image" alt="Serious Games and SARS-CoV 2" srcset="https://trends.techlab.works/content/images/size/w600/2020/12/sd15projectgroup3_148038_13625498_Final-capstone-project-report_.jpg 600w, https://trends.techlab.works/content/images/size/w812/2020/12/sd15projectgroup3_148038_13625498_Final-capstone-project-report_.jpg 812w"></figure><p><strong>Mask On</strong> is an endless runner game in an
urban setting with a first-person perspective. It has a free-form, generative approach and makes sanitisation a core strategy for dealing with the virus. Handwashing and cleaning surfaces is the number one way this game addresses the concerns of COVID-19.</p><p><br>The three web-based games are available <strong><a href="https://medgames2020.techlab.works/" rel="noreferrer nofollow noopener">here</a></strong>. Remember, these are student projects across 9 weeks, developed end-to-end, NOT by game development students, design students, or software engineers, but by scientists. <br>We believe that with 3 more weeks’ effort and some engineering know-how, each of these games could be released onto various app stores. To that end, we will likely be running broader Med games challenges in 2021, and continuing this offering to the <strong><a href="https://www.sydney.edu.au/courses/units-of-study/2020/meds/meds3888.html" rel="noreferrer nofollow noopener">MEDS3888</a></strong> cohort as well.<br><br><em>Banner virus image created by Yuliya Stepkina as part of a MEDS3888 project using Cinema4D and protein structures from the Protein Data Bank.</em></p>]]></content:encoded></item><item><title><![CDATA[The complete dummies guide to experimentation inside the Enterprise.]]></title><description><![CDATA[<p><br>The University is a unique operating environment to say the least, but there are elements we can learn from other enterprise organisations as well as those connected to educational institutions around the world, that set a gold standard for innovation and product evolution.
These insights are drawn from 8 years</p>]]></description><link>https://trends.techlab.works/dummys-guide-to-experimentation-inside-the-enterprise/</link><guid isPermaLink="false">5eed00908d6a620001a9679b</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Fri, 05 Jun 2020 01:26:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1536412597336-ade7b523ecfc?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<img src="https://images.unsplash.com/photo-1536412597336-ade7b523ecfc?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="The complete dummies guide to experimentation inside the Enterprise."><p><br>The University is a unique operating environment to say the least, but there are elements we can learn from other enterprise organisations, as well as those connected to educational institutions around the world, that set a gold standard for innovation and product evolution. These insights are drawn from 8 years of engagement with MIT Sloan, PwC, Gartner, Forrester and the NMC Horizon report, along with many conversations with industry experts and startups at every level of their organisational structure.</p><p><strong>Analysis takes more work than experimentation:</strong></p><p>It's important to emphasise learning from experiments, and iterations on experiments, over predictive analytics for innovation explorations: you can easily build 4 or 5 mock-ups or mini products with the time and money it would take to set a business analyst loose in the search for requirements. Even more challenging is the limited understanding of requirement analysis in companies that are not statistics shops. There are caveats to this: where you don't know the questions to ask, you can treat the entire predictive and prescriptive analysis approach as the experiment.
This is what we did when we delivered the <a href="http://cpcdata.techlab.works/">CPC-data framework</a>.</p><p><strong>The (sort of) myth of where good ideas come from:</strong></p><p>While I am a firm believer in the coffee house hypothesis, good ideas are only half the story.</p><figure class="kg-card kg-embed-card"><iframe src="https://www.youtube.com/embed/NugRZGDbPFU?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; position: absolute; top: 0px; left: 0px; width: 820px; height: 461.25px; max-width: 100%; margin: 0px auto;"></iframe></figure><p>As enterprises like the University become more and more data-driven, we must take a leaf from our academic colleagues and make hypotheses the centre of our experimentation. This isn't about proofs of concept; it is about minimum viable products to prove or disprove a hypothesis. For example, as we work towards an exhibition app for a current request, we've already disregarded 3 or 4 approaches via hypothesis. We are also implementing 2 or 3 others at this very moment. One of them (or more) will stick and be proven true, and we will implement it into the delivery pipeline for the project, making the outcome incrementally better for very little work.</p><p><strong>The limits aren't the problem:</strong></p><p>Learning the maximum amount possible within specific constraints – i.e., time, money, customer segment, technical implementation, etc. – is the goal, and limits and constraints are there to be challenged.
Often the reason for constraints in long-standing enterprise environments is lost to the mists of time, and a healthy challenge to these constraints is part of the innovation pipeline.<br><br>From the <a href="https://sloanreview.mit.edu/reports/">Sloan report in 2016</a>:</p><blockquote><em>"This highlights a painful insight: The biggest challenges are not technical or financial, but cultural and organizational. At most firms, management overwhelmingly favors planning, programs, projects, and pilots over the real-world benefits of experimental knowledge and insight. Most don’t realize how exponential economics of experimentation can bolster their innovation investment portfolios."</em></blockquote><blockquote><em>"Executives frequently resist easy opportunities to cost-effectively experiment because they fear challenges to their hard-won professional intuitions and authority. Data-driven digital experiments might undermine pet hypotheses or business perspectives. But preserving, protecting, and defending the status quo may prove even more costly."</em></blockquote><p><strong>Make experiments social:</strong></p><p>ICT doesn’t have to own all experiments. Engaging diverse portfolios in the process, through Scrum, product pipeline engagement, or even a chat or dialogue around innovation, inviting comments and critiques, is a path to a more holistic understanding of what needs to change. Broadly socialising results with vendors, community, students and more provides both much-needed transparency and a feedback loop that adds value rather than criticism.</p><p><strong>Prioritisation is hard:</strong></p><p>I don't think I can overstate this. No matter what mechanism you use for prioritisation, it must marry to organisational strategy. For the University there are three pillars: Research, Education and Culture. The sub-strategies under these give us broad brushes with which to paint our experiments.
The actual challenge is not prioritising by strategic alignment; it is balancing the tensions that typically emerge between customer-facing managers and technical managers about what to test first. Our Lab is unique in that we are both technical and customer-facing, and we have been completely transparent about our experiment choices. You can see exactly the types of initiatives we engage in <a href="https://trends.techlab.works/reporting/">here</a>, as can every stakeholder.</p><p><strong>Perfection is the enemy of Done:</strong></p><p>The Pareto Principle, or the 80/20 Principle, tells us we can't solve 100% of problems for 100% of users. In a product-driven world, perfectionist engineering is suppressed in favour of quicker, iterative sensibilities. You solve for 80% of people in the first instance. Outliers can be addressed with time and engagement, but if 80% of your challenge can be solved now, why aren't you doing it?</p><figure class="kg-card kg-bookmark-card"><a class="kg-bookmark-container" href="https://betterexplained.com/articles/understanding-the-pareto-principle-the-8020-rule/"><div class="kg-bookmark-content"><div class="kg-bookmark-title">Understanding the Pareto Principle (The 80/20 Rule) – BetterExplained</div><div class="kg-bookmark-description"></div><div class="kg-bookmark-metadata"><span class="kg-bookmark-publisher">BetterExplained</span></div></div><div class="kg-bookmark-thumbnail"><img src="https://betterexplained.com/wp-content/uploads/pareto/pareto_graph.png" alt="The complete dummies guide to experimentation inside the Enterprise."></div></a></figure><p>Focus on core outcomes as opposed to solving everything for everyone. If you manage expectations, people will understand. The University has a bias for "everyone's voice must be heard and acted on", but this sits in tension with the organisation's capability and funding to deliver that level of outcome.
Pivoting to "everyone's voice must be heard" is a better strategy for delivery of innovation, and even waterfall project engagement.</p><p><strong>Measure trajectory, not just outcomes:</strong></p><p>Look, outcomes are important, but a shift in organisational focus because of experiments with no outcome is almost as important. Helping to align the organisation to innovative and agile practices is as much an outcome as any tool or product delivered from an innovation portfolio.</p><p><strong>Be Ethical:</strong></p><p>'Don't Be Evil' and 'Do The Right Thing' are Google's and Alphabet's mottos respectively, and form part of their codes of conduct. Set boundaries, and understand what data can mean. Engage with cybersecurity specialists in design, and privacy experts in execution.</p>]]></content:encoded></item><item><title><![CDATA[Student Innovation Challenge 2020]]></title><description><![CDATA[27 July 2020, 10:00AM|University of Sydney]]></description><link>https://trends.techlab.works/student-innovation-challenge-2020/</link><guid isPermaLink="false">5eed00908d6a620001a9679d</guid><dc:creator><![CDATA[TechLab]]></dc:creator><pubDate>Fri, 22 May 2020 01:36:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1574974560650-985fb5c87e3c?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<blockquote><em>In a virtual expo-style event, students pitch their start-ups, interdisciplinary projects and innovative research</em></blockquote><figure class="kg-card kg-embed-card"><iframe src="https://www.youtube.com/embed/R9kcCssTfw8?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; position: absolute; top: 0px; left: 0px; width: 820px; height: 461.25px; max-width: 100%; margin: 0px
auto;"></iframe></figure><img src="https://images.unsplash.com/photo-1574974560650-985fb5c87e3c?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Student Innovation Challenge 2020"><p>The <a href="https://www.sydney.edu.au/engage/events-sponsorships/innovation-week/student-challenge.html">Student Innovation Challenge</a> is a fantastic opportunity for University of Sydney students to present their innovative solutions to real-world problems. There is over $28,000 worth of prizes to be won!</p><p><a href="https://www.sydney.edu.au/news-opinion/news/2019/08/23/student-entrepreneurs-pitch-their-bright-ideas.html">Last year’s winners</a> included a proactive management tool to combat UV risk in the workplace, rocketry research in the aerospace sector, and a digital tool that empowers underprivileged farmers to use fertiliser effectively.</p><h2 id="2020-categories-and-prizes"><strong>2020 categories and prizes</strong></h2><h3 id="start-up-innovation-prize-10-000"><strong>Start-up Innovation Prize - $10,000</strong></h3><p>Is your idea ready to launch? The Start-up Innovation Prize is for individual students, or teams of up to 5, who have developed a new idea with commercial potential. This competition is best suited to students who have already made demonstrable progress.</p><p>In addition to the cash prize, the winner will also receive a three-month membership at the <a href="https://www.sydney.edu.au/engage/industry-business-partnerships/sydney-knowledge-hub.html">Sydney Knowledge Hub</a>, the University's coworking space for innovative businesses seeking to collaborate with the University.
Membership includes access to:</p><ul><li>flexible workspace</li><li>the Knowledge Hub community</li><li>the programming and events held onsite</li><li>facilitation to find and work with the brightest minds and other resources of the University to get your business to the next level.</li></ul><h3 id="research-innovation-prize-5-000"><strong>Research Innovation Prize - $5,000</strong></h3><p>From theory to practice! This prize is for individual HDR students or teams of up to 5 who have discovered and developed a practical application for their research. This competition is best suited to students who demonstrate that their research has led to this outcome.</p><p>In addition to the cash prize, the winner will also receive a three-month membership at the <a href="https://www.sydney.edu.au/engage/industry-business-partnerships/sydney-knowledge-hub.html">Sydney Knowledge Hub</a>, the University's coworking space for innovative businesses seeking to collaborate with the University. Membership includes access to:</p><ul><li>flexible workspace</li><li>the Knowledge Hub community</li><li>the programming and events held onsite</li><li>facilitation to find and work with the brightest minds and other resources of the University to get your business to the next level.</li></ul><h3 id="interdisciplinary-innovation-prize-5-000"><strong>Interdisciplinary Innovation Prize - $5,000</strong></h3><p>Two heads are better than one! Teams of 2-5 students working together from different disciplines can enter their solutions to real-world problems. The competition is best suited to students who illustrate how their disciplinary expertise has combined to produce a better idea.</p><h3 id="people-s-choice-prize-1-000-x-3"><strong>People’s Choice Prize - $1,000 x 3</strong></h3><p>Shortlisted teams in each of these categories will have the chance to win the People’s Choice Prize. The idea with the most votes in its respective category will win $1,000.
Make sure you spread the word to be in the best spot to win! Voting opens on the <strong><strong>27 July</strong></strong> and closes on <strong><strong>12 August</strong></strong>. Check back here on the <strong><strong>27 July</strong></strong> for details.</p><h2 id="how-to-apply"><strong>How to Apply</strong></h2><p><a href="https://www.judgify.me/studentinnovationchallenge2020">Fill out the online application form.</a></p><p>The application for any innovation award should be uploaded as a 2-page pitch document in PDF format with an accompanying video or animated presentation (3 minute maximum). It should clearly explain the problem or opportunity on which the team has focused, and the solution that has been developed. It should present a strong case for the idea and how it fits the competition objective. Future targets and goals should be explained, as well as any traction the team has so far (e.g. prototype, funding, sales and partnerships).</p><p>Shortlisted teams will be announced on the 27 July 2020 and invited to present a virtual pitch to the Showcase Judging Panel on 12 August 2020. The winners for each category and the People’s Choice Prize will be announced on 26 August 2020.</p><h2 id="key-dates"><strong>Key dates</strong></h2><ul><li>Applications open: 18 May 2020</li><li>Applications close: 25 June 2020</li><li>Shortlisted teams announced: 27 July 2020</li><li>Showcase Judging Panel: 12 August 2020</li><li>Winners announced: 26 August 2020.</li></ul><h2 id="judging-criteria"><strong><a href="https://www.sydney.edu.au/content/dam/corporate/documents/engage/events-sponsorships/student-innovation-challenge-judging-criteria_2020.pdf">Judging Criteria</a></strong><br></h2><h2 id="data-privacy"><strong>Data Privacy</strong></h2><p>Students should be aware that shortlisted applications will be publicly visible and should not include sensitive or confidential material. 
The application process, judging process and People’s Choice Award voting will be managed through the online platform <a href="https://www.judgify.me/l/privacy-policy">Judgify</a>. Judgify is not owned by the University. Students should review the <a href="https://www.judgify.me/l/privacy-policy">Judgify privacy policy</a> before submission. Students should keep copies of any material uploaded to Judgify for their own records.</p>]]></content:encoded></item><item><title><![CDATA[Innovating from your living room]]></title><description><![CDATA[<blockquote>For many weeks since March 2020, owing to the rapid onset of the COVID-19 pandemic, social isolation has become the norm. The Australian and NSW governments have implemented stringent controls and essentially restricted residents to their homes except for groceries, essential services and constitutionals. Collaboration and continued innovation in these</blockquote>]]></description><link>https://trends.techlab.works/innovating-from-your-living-room/</link><guid isPermaLink="false">5eed00908d6a620001a9679c</guid><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Thu, 07 May 2020 01:33:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1491336238524-c990bd671778?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<blockquote>For many weeks since March 2020, owing to the rapid onset of the COVID-19 pandemic, social isolation has become the norm. The Australian and NSW governments have implemented stringent controls and essentially restricted residents to their homes except for groceries, essential services and constitutionals. 
Collaboration and continued innovation in these changing circumstances have been imperative. So far, we’ve undertaken coding sessions, VR development, meetings via phone, meetings via Zoom, lots and lots of communicating via instant message, and have even taught and learned in fully immersive Virtual Reality. Here are some key takeaways.</blockquote><h2 id="working-from-home-powered-by-technology-equipment-to-make-your-life-easier-"><strong><strong>Working from home, powered by technology - equipment to make your life easier.</strong></strong></h2><img src="https://images.unsplash.com/photo-1491336238524-c990bd671778?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Innovating from your living room"><p>Some basic equipment has been crucial to our business-as-usual approach.</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2020/06/Screen-Shot-2020-06-19-at-3.51.49-pm.png" class="kg-image" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/Screen-Shot-2020-06-19-at-3.51.49-pm.png 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/Screen-Shot-2020-06-19-at-3.51.49-pm.png 1000w, https://trends.techlab.works/content/images/size/w1200/2020/06/Screen-Shot-2020-06-19-at-3.51.49-pm.png 1200w"><figcaption>A home Zoom and podcast booth, consisting of a USB microphone, a high-quality webcam with a light ring, and a room divider from Catch of the Day.</figcaption></figure><h3 id="audio-equipment-"><strong><strong>Audio Equipment:</strong></strong></h3><ul><li>Headset for audio only: <a href="https://www.bhphotovideo.com/c/product/1400991-REG/plantronics_207576_01_blackwire_5200_series_usb.html" rel="noreferrer nofollow noopener">Plantronics Blackwire 5220</a></li><li>Wireless Microphone for audio capture: <a
href="https://www.amazon.com/Samson-Wireless-Microphone-System-SWXPD2BLM8/dp/B07HPRZBKC/" rel="noreferrer nofollow noopener">Samson USB Wireless Microphone</a></li><li>Desktop Microphones: keep your eyes out for when these come back into stock; they are a game changer: <a href="https://www.amazon.com.au/Neewer-Microphone-Suspension-Broadcasting-Recording/dp/B07DK89QZS/ref=sr_1_6?keywords=neewer&amp;qid=1588556017&amp;sr=8-6" rel="noreferrer nofollow noopener">Neewer W-7000 USB microphone on arm</a></li></ul><h3 id="video-equipment-"><strong><strong>Video Equipment:</strong></strong></h3><ul><li>Webcam for audio and video: <a href="https://www.logitech.com/en-us/product/c930e-webcam" rel="noreferrer nofollow noopener">Logitech C930e</a> or <a href="https://www.logitech.com/en-us/product/brio?crid=34" rel="noreferrer nofollow noopener">Logitech Brio 4K</a></li><li>Webcam for video: <a href="https://www.logitech.com/en-us/product/c930e-webcam" rel="noreferrer nofollow noopener">Logitech C930e</a> or <a href="https://www.logitech.com/en-us/product/brio" rel="noreferrer nofollow noopener">Logitech Brio 4K</a> or <a href="https://www.amazon.com.au/Razer-RZ19-02320100-R3U1-Kiyo-Streaming-Built/dp/B075N1BYWB">Razer Kiyo</a></li><li>On most computers you can adapt your digital SLR to work as a webcam</li><li>Tripod: <a href="https://www.amazon.com/Victiv-Camera-Aluminum-Monopod-72-inch/dp/B07JCG1BKY/" rel="noreferrer nofollow noopener">Victiv 72-inch tripod</a> or any tripod with a standard 1/4 inch screw mount will work; even a <a href="https://www.amazon.com/GorillaPod-Compact-Ballhead-Mirrorless-Charcoal/dp/B074WC9YKL/" rel="noreferrer nofollow noopener">Joby GorillaPod 3K</a> can help with framing</li><li>In a pinch, <a href="https://www.kinoni.com/">EpocCam software</a> can turn your iPhone or Android phone into a webcam</li></ul><h3 id="document-broadcasting-"><strong><strong>Document Broadcasting:</strong></strong></h3><ul><li>If you need to share written notes, or
show overhead capture, you may need an <a href="https://www.amazon.com/Ipevo-5-883-4-01-00-VZ-R-Document-Camera/dp/B0784RZNKT/" rel="noreferrer nofollow noopener">IPEVO VZ-R Document Camera</a>, which has a built-in microphone.</li><li>Second screens (even small ones) will make digital collaboration a lot easier.</li></ul><h3 id="keyboard-and-mouse-"><strong><strong>Keyboard and Mouse:</strong></strong></h3><ul><li>No specific recommendations, but a lighter-weight mouse will help you avoid WHS-related injuries.</li></ul><h3 id="phones-"><strong><strong>Phones:</strong></strong></h3><ul><li>Don't forget to redirect on-campus phone lines; for some reason people still call them!</li><li>The Service Now knowledge base has a lot of details on how to interact with your desk phone from home.</li></ul><h3 id="files-and-storage-"><strong><strong>Files and Storage:</strong></strong></h3><ul><li>Dropbox, SharePoint and OneDrive are all good options, but if you find you and your team collaborate on a lot of documents, nothing beats <a href="https://paper.dropbox.com/">Dropbox Paper</a>. The university has a full license.</li></ul><h2 id="ar-vr-xr-initiatives-in-the-time-of-self-isolation-"><strong><strong>AR/VR/XR initiatives in the time of self-isolation.</strong></strong></h2><p>More virtual reality headsets were sold in March and April than at any time since consumer headsets became available. This has been a combination of two key factors. The first obvious one is that with more people unable to leave the house, escapism to virtual worlds has become exceedingly appealing. The second is the release of <a href="https://store.steampowered.com/app/546560/HalfLife_Alyx/" rel="noreferrer nofollow noopener">Half-Life: Alyx</a>, the first truly AAA game developed exclusively for VR.
Estimates show that the game has sold almost a million copies, which exceeds most regular games, and is far and away the most downloaded VR game to date. On the education side of things, we’ve been hard at work developing an asset management tool for delivering 360 tours and 3-D model sharing for staff and students. We call it <u>Eta-Ki</u>. The translation of “Eta-Ki?” from Bengali would be “what is this?”</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2020/06/etaki-copy.png" class="kg-image" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/etaki-copy.png 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/etaki-copy.png 1000w"><figcaption>A space shuttle with some circumspect annotations in Eta-Ki</figcaption></figure><p>This has helped us to consolidate an approach for media sharing between academics and students, and we hope it will lead to more use in the future. It’s comforting that trends we have been tracking and prototyping with for the past seven years are now in very high demand for teaching and research. Headsets are still a challenge, as we have only around 100 and they are all trapped on campus, but the price continues to drop and, if Half-Life: Alyx is any indication, there will be a lot in homes around the world for us to leverage with our teaching content in the future.</p><h2 id="the-tech-demo-from-home-"><strong><strong>The Tech Demo from Home.</strong></strong></h2><p>There is a lot of equipment that we regularly demonstrate for educators and researchers, be it 3D printers, headsets, wearables, microcomputers or cameras. This has led to a lot of screencasts and video resource creation. It has also led to a lot of equipment in our homes!
Each time one of us ducks to the lab, we seem to bring one more piece of equipment home so we can stream a demonstration for somebody, and it’s clogging up our work areas, which we cannot recommend. Setting limits on what is possible is important here, but in theory, there is no limit to what can be demonstrated via a 360 video or even a high-definition standard-format recording. A personal reflection from running the Anzac Day memorial live stream: it was eerie with four people at an event that usually draws closer to a thousand attendees. All of the tech demo was done from my balcony over the phone and over <a href="https://www.youtube.com/watch?v=QgpXSiPu-Kk" rel="noreferrer nofollow noopener">YouTube</a>. You can even hear me chatting to my cat.</p><h2 id="we-love-whiteboard-"><strong><strong>We Love Whiteboard.</strong></strong></h2><p>Anyone who has a background in design will know the power of the whiteboard. There’s nothing so invigorating as prioritising requirements in a scrum on a large free-form canvas. While there are plenty of digital white-boarding tools, mind-mapping tools and a myriad of multi-user editing tools available, this is a situation where we advocate the lo-fi solution. Write down the requirements (digitally or analog) and share them back to the group through a simple screen share. We’ve been using Dropbox Paper in the first instance. But we’ve also been up-skilling our ‘customers’ in issue management. Bringing them into the mindset of a milestone-driven developer makes their contribution participatory and leads to better results.
Quite often now we wake to an issue in GitHub created by a user that is articulated in a way our team can understand and action.</p><h2 id="virtually-available-"><strong><strong>Virtually Available.</strong></strong></h2><p>While virtual labs are nothing new, the level of fidelity we can implement using Unity, WebGL and other modern delivery modalities means we can generate a lot of immersive and virtual content: imagine what the <a href="https://phet.colorado.edu/en/simulations/category/by-level/university" rel="noreferrer nofollow noopener">University of Colorado</a> do, but on steroids. (A disclaimer: I have a vested interest here, developing pharmacological laboratories in VR in my role outside the University of Sydney; with that said, I still see the value in the university adopting a broader approach.) Many skills can be on-boarded, audited and even certified via virtual instruments. Without getting too far into the pedagogy, and without boring you with details of Bloom’s taxonomy, scaffolding, or productive failure, suffice it to say that the supporting literature for virtual, simulation and immersive learning as powerful tools is growing every day. <a href="https://docs.google.com/document/d/1h2jDltrlwHsau4VOsN0U1bX4BSqC7E_mD0ATC3O2hfQ/edit#heading=h.6caaez79xke" rel="noreferrer nofollow noopener">iDesign provides a great resource</a> with practical examples, but we can develop even higher-fidelity solutions which, research says, increase the effectiveness of learning. Things like lab safety, complex and expensive instruments, and difficult locations are the obvious choices, but we have simulated mitochondria as big as buildings, and planets as small as marbles.</p><h2 id="up-skilling-always-"><strong><strong>Up-skilling always.</strong></strong></h2><p>Isolation is a great time to work on capability for the future.
Some members of the ICT TechLab team have been engaged in the Unity learning system. While we doubt this audience needs to be told that upskilling your team is important, we understand that austerity measures make budgets difficult. With that in mind, here is a list of free resources we have been engaging with to keep our team at the leading edge.</p><ul><li>The NSW state government has made <a href="https://www.nsw.gov.au/news/free-short-courses-to-support-nsw" rel="noreferrer nofollow noopener">various short courses free</a></li><li><a href="https://blog.udacity.com/2020/03/one-month-free-on-nanodegrees.html" rel="noreferrer nofollow noopener">Udacity</a> are offering free nanodegrees (a concept universities may be quick to investigate following this pandemic)</li><li>All University staff have access to <a href="https://www.linkedin.com/learning" rel="noreferrer nofollow noopener">LinkedIn Learning</a>, which has a fairly broad range of short courses.</li><li><a href="https://generalassemb.ly/free-online-learning" rel="noreferrer nofollow noopener">General Assembly</a> are running some of their boot-camps for free online</li><li><a href="https://www.ribit.net/" rel="noreferrer nofollow noopener">Ribit</a> is a digital platform that connects tertiary students to part-time paid employment, study-related projects, and course-accredited placements that focus on digital and STEM skills relevant to tertiary students’ study and future career aspirations. Ribit is Airtasker for student jobs.</li></ul><h2 id="health-and-well-being-be-inclusive-and-flexible-"><strong><strong>Health and well-being – be inclusive and flexible.</strong></strong></h2><p>We won't claim to be all-rounders or experts in this area, and we are still improving daily, but physical activity is something we all need, and mental health is more important than ever. There is a myriad of resources available to help manage your mental health while in isolation, and we recommend trying all of them.
One thing we can say has worked is taking time to video chat with friends and colleagues after-hours about non-work things, and about work things. We used to vent on Fridays at the pub or to our friends at dinner; now we need another way to do that. For some of us, venting is healthy. Another unseen benefit of the startup culture we run in the TechLab is topping and tailing each day with a positive interaction among the team. For most of the team, their first meeting of the day will be with their teammates to talk about the agenda for the day’s work, and the last meeting of the day will be to discuss anything that got missed, arising business and challenges to overcome. There’s something to be said for starting and ending the day with a group of people who are genuinely interested in each other’s wellbeing, and who make it known. All our team calls end with “stay safe” and it makes it feel like every day is R U OK? Day.</p><p>For physical fitness in your lounge room, nothing beats <a href="https://www.reddit.com/user/GovSchwarzenegger/comments/flz3es/stay_at_home_stay_fit/">this workout routine</a>, recently published on Reddit:</p><figure class="kg-card kg-gallery-card kg-width-wide"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/5ctNfUO.jpg" width="997" height="743" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/5ctNfUO.jpg 600w, https://trends.techlab.works/content/images/size/w997/2020/06/5ctNfUO.jpg 997w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/1SbhXSf.jpg" width="1001" height="861" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/1SbhXSf.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/1SbhXSf.jpg 1000w,
https://trends.techlab.works/content/images/size/w1001/2020/06/1SbhXSf.jpg 1001w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/ZMYdgUv.jpg" width="1093" height="775" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/ZMYdgUv.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/ZMYdgUv.jpg 1000w, https://trends.techlab.works/content/images/size/w1093/2020/06/ZMYdgUv.jpg 1093w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/NpD1Cqb.jpg" width="911" height="973" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/NpD1Cqb.jpg 600w, https://trends.techlab.works/content/images/size/w911/2020/06/NpD1Cqb.jpg 911w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/wHsbAMM.jpg" width="767" height="1084" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/wHsbAMM.jpg 600w, https://trends.techlab.works/content/images/size/w767/2020/06/wHsbAMM.jpg 767w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/HiTTLSo.jpg" width="994" height="901" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/HiTTLSo.jpg 600w, https://trends.techlab.works/content/images/size/w994/2020/06/HiTTLSo.jpg 994w"></div></div><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/o8HCpoR.jpg" width="1095" height="784" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/o8HCpoR.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/o8HCpoR.jpg 1000w, 
https://trends.techlab.works/content/images/size/w1095/2020/06/o8HCpoR.jpg 1095w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/lDdYAuI.jpg" width="1001" height="689" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/lDdYAuI.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/lDdYAuI.jpg 1000w, https://trends.techlab.works/content/images/size/w1001/2020/06/lDdYAuI.jpg 1001w"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/2kxr1Nf.jpg" width="847" height="1198" alt="Innovating from your living room" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/2kxr1Nf.jpg 600w, https://trends.techlab.works/content/images/size/w847/2020/06/2kxr1Nf.jpg 847w"></div></div></div></figure><h2 id="conclusions-and-musing-"><strong><strong>Conclusions and musings.</strong></strong></h2><p>We may be through this in a few more weeks or months here in Australia, with a slow meander back to normalcy; or perhaps there will be a relapse and we will end up in this state for the rest of the year. It’s impossible to say. One thing, however, is certain: when we do return to the lab, we will be taking many of these digital practices with us.</p>]]></content:encoded></item><item><title><![CDATA[Tacotron-2 Audio Synthesis]]></title><description><![CDATA[<h2 id="overview"><strong><strong>Overview</strong></strong></h2><p>When people talk or sing, different muscles are being used, including some in the mouth and throat. Just like other muscles in the human body, overuse of the ones that help humans speak can lead to fatigue, strain and injury.</p><p>In February 2018,
Google researchers published a paper, <a href="https://arxiv.org/pdf/1712.05884.pdf">Natural TTS</a></p>]]></description><link>https://trends.techlab.works/tacotron/</link><guid isPermaLink="false">5eed0d718d6a620001a96803</guid><dc:creator><![CDATA[Lydia Gu]]></dc:creator><pubDate>Sat, 28 Mar 2020 19:11:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1457689146074-bd667e343a9c?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<h2 id="overview"><strong><strong>Overview</strong></strong></h2><img src="https://images.unsplash.com/photo-1457689146074-bd667e343a9c?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Tacotron-2 Audio Synthesis"><p>When people talk or sing, different muscles are being used, including some in the mouth and throat. Just like other muscles in the human body, overuse of the ones that help humans speak can lead to fatigue, strain and injury.</p><p>In February 2018, Google researchers published a paper, <a href="https://arxiv.org/pdf/1712.05884.pdf">Natural TTS Synthesis by Conditioning WaveNet on Mel Spectrogram Predictions</a>, where they presented a neural text-to-speech model that learns to synthesise speech directly from (text, audio) pairs.</p><h2 id="system-setup"><strong><strong>System setup</strong></strong></h2><pre><code class="language-bash">git clone https://github.sydney.edu.au/TechLab/tacotron.git
</code></pre><ol><li>Build the Docker image</li></ol><pre><code class="language-bash">docker build -t nginx/tacotron2 .
</code></pre><p>Run the build from inside <code>/tacotron</code>.</p><ol><li>Run the built Docker image (nginx/tacotron2)</li></ol><pre><code class="language-bash"># --gpus all exposes the host GPUs to the container; -p 8888:8888 publishes the Jupyter port
docker run --gpus all -it -p 8888:8888 nginx/tacotron2
cd tacotron2/

git submodule init; git submodule update --remote --merge

python waveglow/convert_model.py waveglow_256channels.pt waveglow_256channels_new.pt

jupyter notebook --ip 0.0.0.0 --no-browser --allow-root &amp;


</code></pre><p>Then point the training filelists at the LJ Speech audio directory:</p><p><code>sed -i -- 's,DUMMY,LJSpeech-1.1/wavs,g' filelists/*.txt</code></p><h2 id="experiments"><strong><strong>Experiments</strong></strong></h2><p>The TechLab team's solution uses Tacotron 2, based on the <a href="https://github.com/NVIDIA/tacotron2">Nvidia PyTorch implementation</a> of the paper <a href="https://arxiv.org/pdf/1712.05884.pdf">Natural TTS Synthesis By Conditioning Wavenet On Mel Spectrogram Predictions</a> (J. Shen, et al.).</p><h3 id="a-deep-dive-on-the-audio-with-librosa"><strong><strong>A deep dive on the audio with LibROSA</strong></strong></h3><h4 id="install-libraries"><br><strong><strong>Install libraries</strong></strong></h4><p>Firstly, let's install and import libraries such as <code>librosa</code>, <code>matplotlib</code> and <code>numpy</code>.</p><pre><code class="language-python">import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np</code></pre><h4 id="loading-in-an-audio-file-and-plot-the-wave"><strong><strong>Loading in an audio file and plot the wave</strong></strong></h4><pre><code class="language-python"># Load audio file
filename = 'output/chunk2.mp3'
y, sr = librosa.load(filename)

# Trim silent edges
speech, _ = librosa.effects.trim(y)

# Plot the wave
librosa.display.waveplot(speech, sr=sr)</code></pre><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2020/05/tacotron_waveplot.png" class="kg-image" alt="Tacotron-2 Audio Synthesis"></figure><h4 id="plot-the-mel-spectrogram"><strong><strong>Plot the Mel spectrogram</strong></strong></h4><pre><code class="language-python"># Mel spectrogram
S = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128, fmax=8000)

plt.figure(figsize=(10, 4))
S_dB = librosa.power_to_db(S, ref=np.max)
librosa.display.specshow(S_dB, x_axis='time', y_axis='mel', sr=sr, fmax=8000)  # match the fmax used above; the default is fmax=sr/2
plt.colorbar(format='%+2.0f dB')
plt.title('Mel-frequency spectrogram')
plt.tight_layout()
plt.show()</code></pre><figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2020/05/tacotron_melspectrogram.png" class="kg-image" alt="Tacotron-2 Audio Synthesis"></figure><h3 id="transfer-learning-using-a-pre-trained-model"><strong><strong>Transfer Learning using a pre-trained model</strong></strong></h3><h4 id="background"><strong><strong>Background</strong></strong></h4><ul><li>Tacotron 2 is one of the most successful sequence-to-sequence models for text-to-speech at the time of publication.</li><li>The experiments delivered by TechLab</li><li>Since we had an audio file of around 30 minutes, the dataset we could derive from it was small. The appropriate approach for this case is to start from the pre-trained Tacotron model (published by NVidia), which was trained on the <a href="https://keithito.com/LJ-Speech-Dataset/">LJ Speech dataset</a>, and then fine-tune it on our small dataset.<br></li></ul><p>In our experiments, there are several things to note:<br>- Sampling rate differences: transfer learning did not work well when the sampling rate of the custom audio differed from that of the LJ Speech dataset. We should always convert our own dataset's sampling rate to match the rate of the dataset used to train the pre-trained model.<br>- With a larger dropout rate, say 0.4-0.5, memory usage would surge and mini-batch training would stop after several epochs. There are discussions of similar scenarios on GitHub and other platforms, e.g. Stack Overflow. People tend to attribute it to the PyTorch architecture, but the real cause remains unclear.<br>- A smaller batch size (e.g. 8 or 16) would lead to severe overfitting.</p><h4 id="issues"><strong><strong>Issues</strong></strong></h4><p>However, the problem we were facing was the severe overfitting caused by the size of the dataset.
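</p><p>One practical takeaway from the sampling-rate note above is to resample any custom audio to the LJ Speech rate (22,050 Hz) before fine-tuning. A minimal sketch using <code>librosa</code> (the helper name is illustrative, not part of the original pipeline):</p><pre><code class="language-python">import librosa

def match_sampling_rate(y, sr, target_sr=22050):
    # LJ Speech is recorded at 22,050 Hz; resample only when the rates differ
    if sr != target_sr:
        y = librosa.resample(y, orig_sr=sr, target_sr=target_sr)
    return y, target_sr
</code></pre><p>Load with <code>librosa.load(path, sr=None)</code> to keep the native rate, then pass the result through this helper before writing out the training wavs.</p><p>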
Generally we can use some techniques to reduce the overfitting and achieve better convergence:<br></p><ul><li>Enlarge the dataset by getting more data from Jason;<br></li><li>Enlarge the dataset by using augmentation techniques on the audio sample provided (ref. to “Data Augmentation for Audio” below)<br></li><li>Other machine learning/deep learning techniques, e.g. regularization, dropout, early stopping, and a bigger batch size per epoch (the TechLab team is limited in applying all of the techniques listed here by the GPU capacity available)<br></li></ul><h2 id="future-work"><strong><strong>Future work</strong></strong></h2><p>Data Augmentation for Audio</p><ul><li>Noise injection - add random values to the data (may help reduce overfitting)</li><li>Shifting time - shift the audio left/right by a random number of seconds (our team implemented this method to create more data samples)</li><li>Changing pitch</li><li>Changing speed<br></li></ul><p><br></p><ul><li>The source code: <a href="https://github.sydney.edu.au/TechLab/tacotron">https://github.sydney.edu.au/TechLab/tacotron</a></li></ul><p>Reference:<br>- <a href="https://www.webmd.com/rheumatoid-arthritis/why-am-i-losing-my-voice">https://www.webmd.com/rheumatoid-arthritis/why-am-i-losing-my-voice</a></p>]]></content:encoded></item><item><title><![CDATA[LIVE: How does TechLab spend its time?]]></title><description><![CDATA[<blockquote><em>Notes:</em><br><em>• All visualisations are connected to live data collected from internal reporting.</em><br><em>• Graphs are designed for light mode <u>only</u>; toggle using the top-right circle button.</em></blockquote><hr><p>Since April 1, 2019, TechLab members have been reporting on how they spend their time in the lab.
This report answers different questions, such as—</p>]]></description><link>https://trends.techlab.works/reporting/</link><guid isPermaLink="false">5eed0e188d6a620001a9680e</guid><dc:creator><![CDATA[John Jota]]></dc:creator><pubDate>Tue, 17 Mar 2020 19:13:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1506784365847-bbad939e9335?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<blockquote><em>Notes:</em><br><em>• All visualisations are connected to live data collected from internal reporting.</em><br><em>• Graphs are designed for light mode <u>only</u>; toggle using the top-right circle button.</em></blockquote><hr><img src="https://images.unsplash.com/photo-1506784365847-bbad939e9335?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="LIVE: How does TechLab spend its time?"><p>Since April 1, 2019, TechLab members have been reporting on how they spend their time in the lab. This report answers different questions, such as—which initiatives has the team been working on? Is it a meeting or development work? Which faculty or unit are they engaged with? Overall, this live reporting has now recorded:</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe title="TechLab Tally" aria-label="Table" src="https://datawrapper.dwcdn.net/ssnu9/6/" scrolling="no" style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; max-width: 100%; border: none; height: 170px;" width="804" height="169" frameborder="0"></iframe><figcaption>Total Statistics</figcaption></figure><p>Each initiative handled by TechLab is assessed with an <strong>impact score from 1 to 50</strong>.
This allows an assessment of how the lab is allocating its time and maximising the benefits it can deliver with limited resources.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe title="TechLab Initiatives" aria-label="Table" src="https://datawrapper.dwcdn.net/rf8DU/5/" scrolling="no" style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; max-width: 100%; border: none; height: 538px;" width="795" height="586" frameborder="0"></iframe><figcaption>List of Initiatives worked on by TechLab</figcaption></figure><hr><p>The profile of hours reported by the TechLab team varies from month to month. Multiple factors contribute to this, including, but not limited to, semester activities, team headcount, and/or scheduled university shutdowns. With the varying schedule, initiatives are still prioritised appropriately; a comparison of monthly average impact and hours spent can be seen below.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe title="TechLab's Hours and Impact Comparison" aria-label="Column Chart" src="https://datawrapper.dwcdn.net/Wo22l/1/" scrolling="no" style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; max-width: 100%; border: none; height: 363px;" width="600" height="370" frameborder="0"></iframe><figcaption>Hours vs Impact (click the tabs to see the difference).</figcaption></figure><hr><p>Just as TechLab is composed of people with varied specialisations and expertise, its members also spend their time on varied types of activities, wearing different hats at work.</p><figure class="kg-card kg-embed-card"><iframe title="Breakdown of TechLab's Activities" aria-label="Interactive donut chart" src="https://datawrapper.dwcdn.net/7X6uS/10/" scrolling="no" style="background-repeat: no-repeat; box-sizing: border-box; vertical-align: middle; max-width: 100%; border: none; height: 491px;" width="571" height="490"
frameborder="0"></iframe></figure>]]></content:encoded></item><item><title><![CDATA[Additive Manufacturing]]></title><description><![CDATA[<h1></h1><p>We live in a world where corn and sugar cane have never been more important. I’m not talking about food consumption.</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/576px-CSIRO_ScienceImage_3020_Starch_to_Polymer.jpg" width="576" height="720" alt></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/8611895213_d3e210a799_c.jpg" width="800" height="600" alt srcset="https://trends.techlab.works/content/images/size/w600/2020/06/8611895213_d3e210a799_c.jpg 600w, https://trends.techlab.works/content/images/size/w800/2020/06/8611895213_d3e210a799_c.jpg 800w"></div></div></div><figcaption>L - Sugar Cane Corn Starch (CSIRO); R - 3D printed PLA teapot (Creative Tools)</figcaption></figure><p><br>Looking back to my high school days in 2017, I took Chemistry for</p>]]></description><link>https://trends.techlab.works/additive-manufacturing/</link><guid isPermaLink="false">5eed00908d6a620001a96799</guid><dc:creator><![CDATA[John Antonios]]></dc:creator><pubDate>Tue, 10 Mar 2020 00:32:30 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1582879304171-8041c73bedbd?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<h1></h1><img src="https://images.unsplash.com/photo-1582879304171-8041c73bedbd?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="Additive Manufacturing"><p>We live in a world where corn and sugar cane have never been more important.
I’m not talking about food consumption.</p><figure class="kg-card kg-gallery-card kg-width-wide kg-card-hascaption"><div class="kg-gallery-container"><div class="kg-gallery-row"><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/576px-CSIRO_ScienceImage_3020_Starch_to_Polymer.jpg" width="576" height="720" alt="Additive Manufacturing"></div><div class="kg-gallery-image"><img src="https://trends.techlab.works/content/images/2020/06/8611895213_d3e210a799_c.jpg" width="800" height="600" alt="Additive Manufacturing" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/8611895213_d3e210a799_c.jpg 600w, https://trends.techlab.works/content/images/size/w800/2020/06/8611895213_d3e210a799_c.jpg 800w"></div></div></div><figcaption>L - Sugar Cane Corn Starch (CSIRO); R - 3D printed PLA teapot (Creative Tools)</figcaption></figure><p><br>Looking back to my high school days in 2017, I took Chemistry for my HSC and we were introduced to plastics. After endlessly studying and drawing a variety of plastics' chemical structures, I stumbled upon something called Polylactic Acid (PLA).</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://trends.techlab.works/content/images/2020/06/image.png" class="kg-image" alt="Additive Manufacturing"><figcaption>The PLA chemical structure.</figcaption></figure><p>What makes PLA different from other plastics is that it is derived from renewable resources, such as sugar cane and corn starch. This makes PLA a “bioplastic”, as it is created from biomass, which also makes it biodegradable! In other words, PLA naturally degrades when exposed to the environment.</p><p>For example, a bottle made out of PLA that is left in the ocean would typically degrade in 6 to 24 months. 
Conventional plastics, by comparison, would take several hundred to a thousand years to degrade.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="459" height="344" src="https://www.youtube.com/embed/JgrevlhUSNI?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>What is PLA? A nice video that explains PLA and its benefits.</figcaption></figure><p>Now why on Earth am I talking about PLA? Well, PLA is now used as filament material for 3D printing, which is a form of Additive Manufacturing! </p><p>The benefits of PLA all apply to its use in 3D printing. Additive manufacturing is essentially the process of building 3D objects by adding layer upon layer of material.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="480" height="270" src="https://www.youtube.com/embed/Kb0egLiolbo?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>How a 3D printed object is made.</figcaption></figure><p>So now that we know what PLA is and how it is used in Additive Manufacturing, what are its applications?</p><p><strong>Education</strong><br>At the University of Sydney, 3D printing has been widely adopted. There are makerspaces all over campus that allow students to come in and print a 3D model. Subjects are now teaching students how to use CAD software to design their own objects for assessment work and print them! For example, a few students from my degree, the Bachelor of Design Computing, participated in a <a href="https://biodesignchallenge.org/university-of-sydney">Biodesign Challenge</a>. 
They created a <a href="https://www.sydney.edu.au/news-opinion/news/2018/06/12/sydney-students-design-smart-bandage-for-burns.html">Smart Bandage applicator solution</a> that was 3D printed as a prototype and showcased in New York!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://www.sydney.edu.au/dam/corporate/images/news-and-opinion/news/2018/june/Biodesign_HydroHeal%201.jpg/_jcr_content/renditions/cq5dam.web.1280.1280.jpeg" class="kg-image" alt="Additive Manufacturing"><figcaption>Smart bandage applicator designed by students from the University of Sydney.</figcaption></figure><p><strong>Household</strong><br>Additive manufacturing and 3D printing are now readily available to the everyday person! A budget-friendly, home-ready 3D printer like the Creality Ender 3 costs around $320. You’ll be able to set it up in no time and get started printing your favourite objects!</p><figure class="kg-card kg-image-card kg-card-hascaption"><img src="https://paper-attachments.dropbox.com/s_9A65502820FD3B105CBF25DE9320A948F8D71D63EF54191B766D652FD4558086_1583798074651_Screen+Shot+2020-03-10+at+10.52.53+am.png" class="kg-image" alt="Additive Manufacturing"><figcaption>This is from Amazon Australia, the price is in AUD. <a href="https://www.amazon.com.au/Creality-3D-Printer-Printing-Precision/dp/B07M9BF4GN/" rel="noreferrer nofollow noopener">https://www.amazon.com.au/Creality-3D-Printer-Printing-Precision/dp/B07M9BF4GN/</a></figcaption></figure><p><br><strong>Health and Medicine</strong><br>One example I want to show, which is quite amazing, is this:</p><figure class="kg-card kg-embed-card kg-card-hascaption"><div id="fb-root"></div>
<script async="1" defer="1" crossorigin="anonymous" src="https://connect.facebook.net/en_GB/sdk.js#xfbml=1&amp;version=v6.0"></script><div class="fb-video" data-href="https://www.facebook.com/usydtechlab/videos/863429803831893/"><blockquote cite="https://www.facebook.com/usydtechlab/videos/863429803831893/" class="fb-xfbml-parse-ignore"><a href="https://www.facebook.com/usydtechlab/videos/863429803831893/"></a><p>Arduinos are great! They can also do some amazing things on a budget. For those attending bean 2017 at the shine dome in Canberra, we will challenge you to design the next great wearable!</p>Posted by <a href="https://www.facebook.com/usydtechlab/">University of Sydney TechLab</a> on Monday, 4 December 2017</blockquote></div><figcaption>The CLAW: A robotic claw that is controlled by muscle flex.</figcaption></figure><p>It is a 3D printed arm attached to a motor that is controlled by an Arduino with a muscle sensor. As the person flexes their muscle, the arm clenches; as they relax, the arm releases. I worked on this project with <a href="https://trends.techlab.works/author/jim/">Jim Cook</a>, and it took just 10 lines of code. Imagine the practicality of this at an industrial scale.</p><p><br>That’s it from me. I hope you learnt something from this! 
</p>]]></content:encoded></item><item><title><![CDATA[What We’ve Learned From 7 Years of 360 Video]]></title><description><![CDATA[<blockquote>Recently we’ve had an opportunity to try out the new <a href="https://www.insta360.com/product/insta360-oner_twin-edition" rel="noreferrer nofollow noopener"><strong>Insta360 One R Twin</strong> </a>and it seems like a good opportunity to reflect on how far we’ve come, what we expect to see in the coming year or two, and what that means for the education sector.</blockquote><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe src="https://player.vimeo.com/video/383251195?app_id=122963" width="426" height="240" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Introducing Insta360 ONE R - Adapt to the Action"></iframe><figcaption><a href="https://www.insta360.com/product/insta360-oner_twin-edition">Insta360</a></figcaption></figure>]]></description><link>https://trends.techlab.works/what-weve-learned-from-7-years-of-360-video/</link><guid isPermaLink="false">5eed00908d6a620001a96797</guid><category><![CDATA[Extended Reality]]></category><dc:creator><![CDATA[Jim Cook]]></dc:creator><pubDate>Mon, 09 Mar 2020 02:00:00 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1520697830682-bbb6e85e2b0b?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<blockquote>Recently we’ve had an opportunity to try out the new <a href="https://www.insta360.com/product/insta360-oner_twin-edition" rel="noreferrer nofollow noopener"><strong>Insta360 One R Twin</strong> </a>and it seems like a good opportunity to reflect on how far we’ve come, what we expect to see in the coming year or two, and what that means for the education sector.</blockquote><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe src="https://player.vimeo.com/video/383251195?app_id=122963" width="426" 
height="240" frameborder="0" allow="autoplay; fullscreen" allowfullscreen title="Introducing Insta360 ONE R - Adapt to the Action"></iframe><figcaption><a href="https://www.insta360.com/product/insta360-oner_twin-edition">Insta360 One R Twin Edition</a></figcaption></figure><hr><img src="https://images.unsplash.com/photo-1520697830682-bbb6e85e2b0b?ixlib=rb-1.2.1&q=80&fm=jpg&crop=entropy&cs=tinysrgb&w=2000&fit=max&ixid=eyJhcHBfaWQiOjExNzczfQ" alt="What We’ve Learned From 7 Years of 360 Video"><p>Our first 360 camera was four <a href="https://www.elmousa.com/product/qbic-ms-1/" rel="noreferrer nofollow noopener"><strong>Elmo QBic MS-1</strong></a>s in a plastic rig that positioned them for overlap. It was unwieldy and low resolution, required manual stitching in proprietary software, and was, at the time, difficult to import into a VR/XR game development suite like Unity. We struggled to get good footage, but one group of students managed to make a pretty good lab tour across a whole semester.</p><p>After the stitching nightmare of our first rig, we were looking for an all-in-one solution. This brought us to the <a href="http://www.360fly.com/" rel="noreferrer nofollow noopener"><strong>360 fly</strong></a>, which was much cheaper and came in a much nicer form factor. We captured a lot of footage with this camera, and though it only supported a 240-degree vertical field of view, careful production choices and positioning gave some impressive results, discounting this nausea-inducing adventure (don’t have rapid, janky movement in Virtual Reality).</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="480" height="270" src="https://www.youtube.com/embed/F0qoxfpidnI?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>TechLab video footage using <strong>360 fly</strong></figcaption></figure><p>The resolution, though, as you can see, left a lot to be desired. 
We tried the <a href="https://theta360.com/en/" rel="noreferrer nofollow noopener"><strong>Ricoh Theta 360</strong></a>, the <a href="https://www.samsung.com/global/galaxy/gear-360/" rel="noreferrer nofollow noopener"><strong>Samsung Gear 360</strong></a>, and the <a href="https://www.insta360.com/product/insta360-nano/" rel="noreferrer nofollow noopener"><strong>Insta 360 Nano</strong></a>; all were fine on stitching but lacking in resolution.</p><hr><p>Our next camera was the <strong>Nikon Key Mission 360</strong>, which came with three key benefits.</p><ol><li>4K resolution with onboard stitching.</li><li>Waterproof and submersible.</li><li>Sub-$500 AUD price point.</li></ol><p>For a long time this was our go-to camera, and it still has applications due to its waterproof nature. The quality was excellent, but the on-camera stitching wasn't easy to control.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="480" height="270" src="https://www.youtube.com/embed/Hh6aPfOgkOI?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>TechLab video footage using <strong>Nikon Key Mission 360</strong></figcaption></figure><p>You'll notice in the clip above that there are very obvious seam lines on the cabinet to your right and the chair to the left. These are exacerbated by the bright light coming through the stained glass in the Carillion room. A lesson learned: sometimes you don't get to choose your location, and especially when developing education or documentary material, you will have limited control over the light.</p><hr><p>This brings us to our current camera. Once we started to enact the University's VR strategy, we knew we needed to be able to control all of the variables. 
We came up with a list of requirements:</p><ol><li>Maximum resolution, the bigger the better, with detail being the goal;</li><li>3D capability (more on this later);</li><li>3D spatial audio;</li><li>Automatic stitching that could also be tuned;</li><li>Long-range remote control (you will notice I’m in the room with the Carillion above by necessity);</li><li>Full control of video settings and real-time preview;</li><li>Stabilisation;</li><li>Live streaming capability at high resolutions.</li></ol><p>There were maybe three or four cameras on the market that met the requirements; however, after extensive research we landed on the <a href="https://www.insta360.com/product/insta360-pro2" rel="noreferrer nofollow noopener">Insta360 Pro 2</a>. There’s nothing quite like 8K 3D VR at 60 FPS.</p><figure class="kg-card kg-embed-card"><iframe width="480" height="270" src="https://www.youtube.com/embed/CI1LzjdN1Q4?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></figure><p>This camera has become our mainstay, from the work we do in food science with Lion Nathan and One Harvest, to filming breakthrough medical facilities like the CPC hybrid theatre, the MRI-Linac, and the heart tissue regeneration lab at Westmead Hospital. It has journeyed to events, and the content from this camera is consumed across the curriculum. However, it is expensive, and that makes it hard to scale. 
Right now, we train and onboard users, and do much of the post-production ourselves.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="480" height="270" src="https://www.youtube.com/embed/X1vq8PixVc0?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>TechLab video footage using <strong>Insta360 Pro 2</strong></figcaption></figure><hr><p>This is where the Insta360 One R Twin may come in: around a tenth of the cost, with a tiny form factor. This might be the balance point: professional-quality 360 content within a university budget. The 3m selfie stick means this can be used to shoot in all sorts of circumstances, the weight is negligible, and the built-in AI for object tracking is truly astounding in such a small footprint. Seams are all but invisible, and the stabilisation is wonderful.</p><figure class="kg-card kg-embed-card kg-card-hascaption"><iframe width="480" height="270" src="https://www.youtube.com/embed/zNdy4yf7FUw?feature=oembed" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe><figcaption>TechLab video footage using <strong>Insta360 One R Twin</strong></figcaption></figure><p>There is no doubt we will see more and more of this in classrooms over the coming years. Immersive learning is becoming the norm, and this technology allows us to provide high-fidelity experiences of environments that may be unsuitable or unsafe for students. While ideally students would experience workplaces and labs directly, this is the next best thing for providing insight into other environments when we have cohorts of 200 or 2,000. Thanks to the ease of use and price of equipment, 360 video delivered in VR or online is now a sustainable solution, and our university, among others, will adopt a VR/XR strategy to leverage it.</p><p>So what is coming soon? 
GoPro has the Max on the market, and it's a worthwhile competitor, but where they both fall down long term is battery life. With a 256GB card you are looking at several hours of footage, but with a spare battery you can maybe run the device for two hours, and editing and previewing on your phone is tedious and also drains its battery. <br><br>Batteries are the big blocker for a lot of the technologies we work with in the TechLab; wearables, XR headsets, mobile phones, and cameras will all be improved by the next generation of battery. Lithium-sulfur is promising but still several years away; lithium solid-state will arrive sooner, driven by safety concerns, thanks to airlines banning Li-Po and Li-ion batteries in luggage, among other similar situations. Without delving too far into the future, <a href="https://www.pocket-lint.com/gadgets/news/146891-power-for-future-devices-could-be-harvested-directly-from-wi-fi-signals" rel="noreferrer nofollow noopener">ambient electric harvesting of radio waves</a> and <a href="https://onlinelibrary.wiley.com/doi/full/10.1002/aenm.201802190" rel="noreferrer nofollow noopener">human nano-generators</a> are both coming down the pipeline, though they are likely 10 years away.</p><p>For now, it's a pleasure to use both the Insta360 Pro 2 and the Insta360 One R Twin. 
Summing up the past seven years, I'd say our real lessons are about scalability and process: while previously the TechLab or Immersive Learning Lab had to be heavily involved to ensure quality production, now, with five minutes of training, anyone can produce a 360 educational video easily and cheaply.</p>]]></content:encoded></item><item><title><![CDATA[Explained: Extended Reality]]></title><description><![CDATA[<figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2020/06/XR_TechLab_Infographic.jpg" class="kg-image" alt srcset="https://trends.techlab.works/content/images/size/w600/2020/06/XR_TechLab_Infographic.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/XR_TechLab_Infographic.jpg 1000w"></figure>]]></description><link>https://trends.techlab.works/extended-reality/</link><guid isPermaLink="false">5eed00908d6a620001a96793</guid><category><![CDATA[Extended Reality]]></category><dc:creator><![CDATA[TechLab]]></dc:creator><pubDate>Tue, 18 Feb 2020 04:32:14 GMT</pubDate><media:content url="https://images.unsplash.com/photo-1538388149542-5e24932d11a8?ixlib=rb-1.2.1&amp;q=80&amp;fm=jpg&amp;crop=entropy&amp;cs=tinysrgb&amp;w=2000&amp;fit=max&amp;ixid=eyJhcHBfaWQiOjExNzczfQ" medium="image"/><content:encoded><![CDATA[<figure class="kg-card kg-image-card"><img src="https://trends.techlab.works/content/images/2020/06/XR_TechLab_Infographic.jpg" class="kg-image" alt="Explained: Extended Reality" srcset="https://trends.techlab.works/content/images/size/w600/2020/06/XR_TechLab_Infographic.jpg 600w, https://trends.techlab.works/content/images/size/w1000/2020/06/XR_TechLab_Infographic.jpg 1000w"></figure>]]></content:encoded></item></channel></rss>