Home | DamienG
Damien Guard on software development, fonts and technology.

http://damieng.com

A good SDK builds on the fundamentals of good software engineering, but SDKs have additional requirements to consider.

Why is an SDK different?

When developing software as a team a level of familiarity is reached between the team members. Concepts, approaches, technologies, and terminology are shaped by the company and the goals are typically aligned. Even as new members join the team a number of 1-on-1 avenues exist to onboard them, such as pairing, mentoring, and strategic choices of what to work on.

Software to be consumed by external developers is not only missing this shared context but each developer will have a unique context of their own. For example: what you think of as an authorization request as a domain expert is unlikely to match what a user thinks authorization means to their app.

The backgrounds of developers can be diverse, with varying abilities and requirements each shaped by their experiences with other software and the industries they've worked in. Onboarding potential customers, developers, or clients with 1-on-1 support simply doesn't scale, and the smallest bump in the road can lead them down a different path and away from your service.

Goals

A guiding principle for developing software is to be user-focused throughout. This is especially important when developing an SDK, and yet it is more easily overlooked because an SDK is created by a developer for a developer. It is important to remember that you are not your audience.

You have in-depth knowledge of the how and why that the user is unlikely to have. Indeed they may not care or even want to learn - they have work of their own to be doing delivering the unique functionality of their solution. If they had the interest, time or experience to deal with the intricacies your library is supposed to take care of, they wouldn't need it.

Some more specific goals to follow are:

Reduce the steps

Every step is another opportunity for the developer to get confused, distracted or disenfranchised. The primary technique for reducing steps is to use defaults and convention liberally. Default to what works for the majority of people while still being secure and open to customization. Where multiple steps are required, consider combining those steps into a use-case-specific flow.

Success can be delivered in parts. If a user can try a default configuration and see that part working, it provides encouragement and incentive to keep going on to the next requirement they have. Each success builds upon the previous to keep the user on track and invested in this solution.

Simplify concepts and terminology

Terminology is essential to a deep understanding of any field; however, it can become a massive barrier to adoption for those less versed in the topic. It is important to use phrases, concepts and terminology your audience will understand rather than the abstract or generic terms defined in underlying RFCs or APIs. This should be reflected in the primary class names, functions, and methods, and throughout the documentation.

When it is necessary to expose less-common functionality you should strive to progressively reveal the necessary detail.
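As a sketch of what progressive disclosure can look like, here is a hypothetical identity-SDK surface in C#. The common case is a single call named in the user's vocabulary, while less-common needs are revealed through an optional options object; every name here is an illustrative assumption rather than an existing API.

using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical identity SDK surface - all names are illustrative only.
public class AuthClient
{
    // The common case reads in the user's vocabulary: "log in".
    public Task<Session> LoginAsync() => LoginAsync(new LoginOptions());

    // Less-common needs are revealed progressively through an options object
    // rather than a long list of required parameters.
    public Task<Session> LoginAsync(LoginOptions options)
    {
        // The underlying protocol steps would be orchestrated here; elided in this sketch.
        return Task.FromResult(new Session());
    }
}

public class LoginOptions
{
    // Spec-level escape hatch for advanced callers, using the protocol's own terms.
    public IDictionary<string, string> AdditionalAuthorizationParameters { get; } =
        new Dictionary<string, string>();
}

public class Session
{
    public string AccessToken { get; set; } = "";
}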
In cases where this exposure provides facilities close to the underlying implementation, it becomes advantageous to revert to the terminology used there for advanced operations.

Guide API discovery

Many platforms and languages have facilities that can be used to help guide API discovery, primarily through autocompletion mechanisms such as Visual Studio's IntelliSense.

Common functionality should flow out from the objects the developer already has at that point. If your API has provided a connection to your service you should not then expect them to go and discover a set of entirely new objects and namespaces.

Many popular pieces of software provide the developer with a "context" object. This is an object that exists only for the current request (in web server applications) or the current instance of the app (in client applications) and provides access to the current user and the various operations available - authorizing, configuring, performing API requests and so on. The act of obtaining this context in, for example, an identity SDK could be logging in.

Namespaces can be used to push more advanced functionality away from newer developers, allowing them to concentrate on the primary use-cases gathered into a single common root or default namespace.

The same principle applies to methods and properties, especially in strongly-typed languages with rich IDE support. A fluent-style API for optional configuration can not only guide you through the available options but can also prevent you from making incompatible choices right at compile time, where it is safe to provide detailed messages in context with the line of code.

Look native

Developers often specialize in just one or two platforms at any one time and become intimately familiar with the design and flavor of those platforms. SDKs should strive to feel like native citizens of that ecosystem, adopting the best practices, naming conventions, calling patterns and integrations that the user expects. Cross-platform solutions that look the same across platforms are only of interest to other cross-platform developers.

When using your API, developers should be able to anticipate how it will function and how error handling, logging, and configuration will work based on their experience with the platform. Take time to understand both what is popular on that platform and which direction things are moving when making choices. Feeling native further reduces the barrier to entry and can replace it with a feeling of "it just works!".

Resist the temptation to make your SDKs work the same way across platforms when it goes against the grain of a platform. It might make things easier for you and your team, but the pain will be pushed onto consumers of your SDK and waste their resources every time your API behaves in a way that is unintuitive to people familiar with the platform.

Inline documentation

Online documentation is a great place both for advanced topics that require multiple interactions and for letting new developers see what is involved before they switch into their IDE or download an SDK.

However, when working in the code itself the SDK should put concise documentation about classes, functions and parameters at the developer's fingertips where possible. Many IDEs can display code annotations, e.g. XML Documentation Comments in C# and JSDoc, and this should be leveraged to keep developers engaged once they start using the SDK.
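Continuing the hypothetical sketch above, XML documentation comments are the kind of annotation a C# IDE surfaces through IntelliSense as the developer types, so the summary, parameter notes and return description are read without ever leaving the editor. The types and wording are again illustrative assumptions only.

using System.Threading.Tasks;

public class AuthClient
{
    /// <summary>
    /// Logs the user in using the recommended flow for this platform.
    /// </summary>
    /// <param name="options">Optional overrides; omit to accept the SDK defaults.</param>
    /// <returns>A session containing the tokens needed to call your own API.</returns>
    public Task<Session> LoginAsync(LoginOptions options = null)
    {
        // Body elided - the point here is the documentation the IDE surfaces inline.
        return Task.FromResult(new Session());
    }
}

public class LoginOptions { }
public class Session { }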
Switching to a browser to read documentation presents them with tabs full of other things needing attention, or other solutions that don't involve using your SDK.

Strategies

Consume an API you wish you had

It can be incredibly advantageous to start by writing the code a user might expect to write to perform the operation in their application. You want that code to be aligned with the goals above: start with a small fragment that exercises a specific scenario, both to prove it and to provide a real-world-like snippet for the documentation (a sketch of such a fragment appears below).

Better yet, adopt or develop a small reference application and show the SDK working as a whole. Such an app can itself be published as a great reference for programmers looking for concrete examples or best practices, and can form the basis of starters, examples and tutorials. These applications also have further long-term value in that they can be used to:

- See if and how the SDK breaks applications when changes are introduced
- Form ideas about how deprecations are handled
- Prove (or disprove) how a proposed SDK change improves the experience

Traditional up-front design requires you to trade off how much research you do before you start designing the system. No matter how much research you do it will never be enough: either there are use cases that were missed, small details that negatively impact the design, or implementation constraints that go against the design in awkward ways.

It is important to approach writing a library by implementing pieces one at a time, continually refining and refactoring as you go, to ensure that you end up with a design that fits both the problem domain and the unseen constraints - imposed either by the technology or by the intricacies of the domain - that would no doubt have been missed in an up-front design. Sample applications and references can really help shape a good design as you go by reflecting how changes to the SDK affect your applications.

Breaking clients is to be avoided where possible, so a design should be refined as much as possible before initial release given both the current constraints and the future direction. If a piece of the SDK is not well baked, strongly consider holding back the unbaked portions so that developers do not take a dependency on them yet. Other avenues are available to help bake the design and functionality of new components, such as forums, private groups, internal teams or other team members, whether on this SDK or on others.
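To make the consumer-first approach concrete, here is the sort of aspirational fragment you might draft before any SDK code exists and then use to drive the design. Every type and member in it is a hypothetical design wish rather than an existing API, so it deliberately will not compile until the SDK grows to meet it.

// The code we wish a consumer could write - drafted first, implemented later.
// All names here are hypothetical design targets, not an existing API.
var auth = new AuthClient(domain: "example.yourservice.com", clientId: "YOUR_CLIENT_ID");
var session = await auth.LoginAsync();
Console.WriteLine($"Signed in; the session expires at {session.ExpiresAt}");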
Design shortcomings are much easier to spot when you are distanced from their creation. Local git branches are a vitally important safety net for aggressive refactoring - commit after each good step.

A note about unit tests

Unit tests are very important for production-quality software, ongoing maintenance and even late-stage refactoring; however, developing unit tests too early in the development cycle can work against aggressive refactoring and redesign. It is difficult to move functionality, fields and methods around between classes when there is a multitude of unit tests expecting them to stay where they are, especially given that unit tests at these early phases tend to be nothing more than checks of the most basic functionality.

You should also pay attention to how much influence the unit tests themselves are exerting on the design of the SDK components, and whether that influence makes those components simpler for consumers or instead pushes onto consumers the work of making many small testable pieces act together as a single simple component that solves traditional use-cases.

Layered design

Some of these goals can be difficult to satisfy in a single design. For example, how do you:

- Ensure that SDKs that appear so different are approachable by engineers at Auth0?
- Avoid a multitude of options when there are choices that need to be made?
- Provide the ability for advanced consumers to go beyond the basics?

A dual-layer design can work very well for client SDKs and help solve these problems.

The lower levels of the design are unit-testable building blocks that very much mirror the underlying APIs, concepts, and RFCs. Classes are focused on a specific task or API but take a variety of inputs, ensuring the right kind of request is sent with the right parameters and encodings. This gives a solid foundation to build on that is well understood by the team as well as across SDKs.

The high-level components are responsible for orchestrating the lower pieces into a coherent, native-friendly, easy-to-use experience for the majority of use cases. They form the basis of quick-starts, initial guidance, and tutorials. Other high-level components may in fact form plug-ins to existing extension points provided by the environment. This layering also helps developers who are perhaps unfamiliar with the platform clearly see how the underlying, familiarly-named elements are used in particular environments or flows.

When consumers need to go beyond the capabilities of the high-level components, the same underlying building blocks used by those components are available for them to compose, reuse and remix as they need.

For example, a C# SDK might include a high-level component for desktop apps that automatically becomes part of the application's startup and shutdown and provides support for opening login windows. Another high-level component might be developed for server-to-server communication and know how to check common configuration patterns, such as Azure application settings or web.config files, and deal with secure secrets. Each is tailored specifically to its use case but uses the same underlying blocks to achieve the result.

It is also advantageous, in environments that support package management, to individually package environment-dependent parts with clear labels that describe their use in that environment. This aids discovery and ensures the SDK does not bring along additional unneeded sub-dependencies for that environment, while still bringing along the core shared lower-level package.
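To ground the dual-layer idea, here is a compressed, hypothetical sketch: a low-level building block that speaks the underlying protocol's language and a high-level component that orchestrates it for one environment. The class names, endpoint, and parameters are illustrative assumptions, not a real SDK.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

// Low-level building block: mirrors the underlying API/RFC, is unit-testable,
// and is shared across SDKs. Its terminology follows the specification.
public class TokenEndpointClient
{
    private readonly HttpClient http;
    private readonly Uri tokenEndpoint;

    public TokenEndpointClient(HttpClient http, Uri tokenEndpoint)
    {
        this.http = http;
        this.tokenEndpoint = tokenEndpoint;
    }

    public async Task<string> ExchangeAuthorizationCodeAsync(string code, string redirectUri)
    {
        var response = await http.PostAsync(tokenEndpoint, new FormUrlEncodedContent(new[]
        {
            new KeyValuePair<string, string>("grant_type", "authorization_code"),
            new KeyValuePair<string, string>("code", code),
            new KeyValuePair<string, string>("redirect_uri", redirectUri),
        }));
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // Raw token response; parsing elided.
    }
}

// High-level component: orchestrates the building blocks into the common
// desktop use case with platform-friendly naming and defaults.
public class DesktopLogin
{
    private readonly TokenEndpointClient tokens;

    public DesktopLogin(TokenEndpointClient tokens) => this.tokens = tokens;

    public Task<string> LoginAsync()
    {
        // A real implementation would open the system browser and wait for the
        // callback; here the low-level block simply finishes the exchange.
        return tokens.ExchangeAuthorizationCodeAsync("code-from-callback", "http://127.0.0.1/callback");
    }
}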
I've been doing a fair amount of work in Nuxt.JS of late. I'd previously used Next.JS but found React not to my liking, and Vue just fits better with my mental model and workflow.

While these frameworks are great for accelerating development, there are definitely some areas where they make things a little trickier. One such area is when you wish to drop some "simple" client-side JavaScript in - here's the process I went through to get Google Trends to work.

Google Trends is a great way of visualizing how search trends change over time. The default embed script, though, will not work in your Nuxt.JS app. If you paste it as-is you will see the error:

Uncaught SyntaxError: expected expression, got '<'

So we should tell Nuxt.JS to run the script only on the client using the client-only tag. Now you will see:

Uncaught ReferenceError: trends is not defined

It appears that the second part of the embed code Google provides is trying to run before the first is ready. I'm not sure why this happens inside Nuxt.JS/Vue and not in a normal browser session, but we can fix it by moving the initialization code into a function and then calling that function from the onload event of the primary script.

Now you will experience one of two things - either the page will reload with only Google Trends and not your content, or you'll get an error about the DOM being changed. This is due to the use of document.write in the default embed code. Thankfully Google includes a renderExploreWidgetTo function, which leads us to...

The solution

Instead of using the default embed code, adapt this version to your needs, replacing just the contents of the initTrendChart function with the corresponding block from your own Google Trends embed code, and voilà!

<template>
  <main>
    <div id="trendChart" />
    <client-only>
      <script>
        function initTrendChart() {
          trends.embed.renderExploreWidgetTo(
            document.getElementById("trendChart"),
            // Replace this block with yours
            "TIMESERIES",
            {
              comparisonItem: [
                {
                  keyword: "ZX Spectrum",
                  geo: "",
                  time: "2004-01-01 2021-07-01"
                }
              ],
              category: 0,
              property: ""
            },
            {
              exploreQuery: "date=all&q=ZX+Spectrum",
              guestPath: "https://trends.google.com:443/trends/embed/"
            }
          );
        }
      </script>
      <script
        type="text/javascript"
        src="https://ssl.gstatic.com/trends_nrtr/2578_RC01/embed_loader.js"
        onload="initTrendChart()"
      ></script>
    </client-only>
  </main>
</template>

If you need multiple charts you'll need to create multiple divs and paste multiple blocks into initTrendChart to ensure they are all initialized. You do not need multiple copies of the embed_loader script.

This also works just fine with markdown files used to render content via Nuxt Content.

Enjoy!

[)amien

Breaking changes are always work for your users - work you are forcing them to do when they upgrade to your new version. They took a dependency on your library or software because it saved them time, but now it's costing them time. Every breaking change is a reason for them to stop and reconsider their options.
If your library is paid-for, or is the preferred way for paying users to access your services, then lost users can come with a real financial cost. Use breaking changes sparingly.

Good reasons for breaking changes

- A feature is being removed for business reasons
- Standards are changing how something works
- The feature did not work as intended and it's impossible to fix without a break
- Service or library dependencies you rely upon are forcing a change
- A small breaking change now prevents more significant breaking changes later

Even when presented with these you should think not only about whether you can avoid a break but also take the opportunity to consider what you can do now to avoid similar breaks in the future.

Poor reasons for breaking changes

- It makes the internals of the library tidier
- Parameter order, method naming or property naming would be "clearer"
- Consistency with other platforms or products
- Personal subjective interpretations of "better"
- Compatibility with a different library to attract new users

While many of these are admirable goals in and of themselves, they are not a reason to break your existing customers.

Managing breaking changes

It goes without saying that intentional breaking changes should only occur in major versions, with the exception of security fixes that require users of your library to take some action.

Here are some thoughts to ease the pain of breaking changes:

- List each breaking change in a migration guide with a before and after code fragment
- Summarize breaking changes in the README with a link to the migration guide for more information
- Keep the breaking change count low even in major releases

Users should ideally also be able to find/replace or follow compiler errors. Consider:

- Platform-specific mechanisms for dealing with breaking changes, e.g. in C# you can use the [Obsolete] attribute to help guide users to the replacement API, while Java has the @deprecated annotation.
- Leaving a stub for the old method in place for one whole major release that calls the new method with the right arguments and produces a log warning pointing to the migration guide.

If a package is drastically different, users will need to rewrite code. This is always a bigger deal for them than you expect, because:

- It is unscheduled and was not on their radar (no, they are not monitoring your GitHub issue discussions)
- They use your library in ways you likely don't expect or anticipate
- The more they depend on your product, the more work your rewrite involves

Really consider whether your new API is actually better (ideally before you ship it). One way to do this is to produce a set of example usage code for the old library vs the new library. Put them side-by-side and open them up to feedback. Is the new API genuinely better?

Some indicators that it might be:

- It is easier to test and abstract
- Existing customers prefer the new syntax and think it's worth changing

Some indicators that it isn't really any better: internal staff prefer it, it aligns better with some other platform, or it is just different.

Sometimes it's worth doing because it targets a different crowd or comes at the problem from a simpler direction or abstraction.
If so, then seriously consider giving it a new package name and putting it out for early access. Make sure users are led to the right library of the two and, if there is a lot of code duplication and users remain on the "old" library, consider making the next version of the old library a compatibility wrapper around the new one.

[)amien

I'm often digging into old bitmap font and UX design out of curiosity - and someday I hope to revive a lot of these fonts in more modern formats using a pipeline similar to that for ZX Origins so we can get all the usable fonts, screenshots etc. out of them.

One limitation I've run into is digging into old Macintosh fonts. While James Friend's PCE.js puts System 6 and System 7 at your fingertips, when it comes to the later 7.5, 8 or 9 the site doesn't have you covered, as PCE doesn't support PowerPC emulation (it handles Motorola 68000 and Intel 8086 processors). This is a shame for me as that's where the interface started diverging by adding color and some more interesting fonts. Additionally, some third-party fonts are distributed in .sit (StuffIt) archives or only work with later Mac OS versions.

Enter QEMU

Thankfully QEMU has us covered. It's an open-source emulator that, unlike regular virtualization tools, is quite capable of emulating completely different CPU architectures, from ARM through to MIPS, PowerPC, RISC-V, SPARC and even IBM's big s390x z/Architecture.

With such a wide variety of options and settings available you might imagine it requires some digging through the user interface - and you'd be wrong. There is no GUI, and the third-party ones that exist mostly seem to be between two and ten years out of date, and many don't support Windows at all. The only "up to date" one I found - QtEmu - only supports configuring x86 virtual machines. This is a shame: although I love the command line for its scriptability, when it comes to exploring valid combinations of options the command line is mostly awful (the IBM AS/400 command-line and prompting system excluded).

You could try to build it yourself, but Stefan Weil has you covered with pre-built QEMU Windows binaries. Please note that Mac sound support is missing here. There are "screamer" forks, but the only binaries available are for Mac OS X, so you'd have to build those yourself and there will probably be a whole lot of hoops to jump through.

Obtaining an OS install image

To install Mac OS 9 we're going to need a disk image/ISO to install from. The nice people over at Mac OS 9 Lives have a Mac OS 9.2.2 Universal Installer ISO which is pre-configured and easy to use - it also conveniently includes a few extra tools and apps you'll need.

While Mac OS is copyrighted, this image has been up for over 6 years, so I like to think Apple are turning a blind eye so that people who want to use their legacy stuff can do so without expecting support from Apple - win-win. They also don't charge for their operating systems - instead they're "free" with the hardware - and I still have a MacBook Pro 15" so I won't feel bad about using it. Your mental mileage may vary.

Creating a machine and installing Mac OS 9

First off, create a new folder to put your machine config into (Windows won't let you stuff it into Program Files). I've chosen c:\retro\mac.

Now let's create an empty hard-drive image file:

cd "c:\program files\qemu"
qemu-img create -f qcow2 c:\retro\mac\MacOS9HD.img 5G

This creates a virtual hard-drive that can grow up to 5 GB in size and will allocate disk as it needs it (copy-on-write).
After installation this file will grow to about 660 MB.

Now, before we go further, remember these keyboard shortcuts:

- Ctrl+Alt+G, as you'll need it to get the mouse back
- Ctrl+Alt+F to get you in and out of full-screen mode

qemu-system-ppc -cpu "g4" -M mac99,via=pmu -m 512 -hda c:/retro/Mac/MacOS9HD.img -cdrom "c:/retro/Mac/Mac OS 9.2.2 Universal Install.iso" -boot d -g 1024x768x32 -device usb-kbd -device usb-mouse -sdl

This specifies that we want to use:

- a PowerPC G4 900MHz CPU (-cpu g4)
- a PowerMac-based machine with USB support (-M mac99,via=pmu)
- 512 MB of RAM (-m 512)
- our hard-drive image (-hda xxx)
- our installation CD mounted (-cdrom xxx)
- boot from CD (-boot d)
- a 1024x768 32-bit display (-g 1024x768x32)
- a USB keyboard (-device usb-kbd)
- a USB mouse (-device usb-mouse)
- the SDL display buffer (-sdl)

There are many other useful config switches available for the PowerPC emulation if you need to troubleshoot or tweak. The final item, SDL, is required because the default GTK display, while faster, has major problems on Windows trying to keep the mouse captured.

After a few seconds you should be presented with a ReadMe. Just close that with the top-left window control, then click into the Drive Setup window, select "not initialized", press the Initialize button, then confirm it with the default on the subsequent Initialize window.

You might now want to click the "untitled" hard drive icon that has appeared on the desktop; wait a few seconds and you should be able to rename it. Macintosh HD is a popular choice.

Head up to the MacOS9Live CD icon, double-click it, then double-click Apple Software Restore. In the window that appears you can accept all the defaults or just change Volume Format to Extended - I did this just in case I want to try to mount the image on my MacBook at a later date. Clicking Restore, then confirming the dialog, will give you a progress bar that is comically fast for installing an operating system (via software emulation, no less).

Now you'll need to head to the Special menu and choose Shut Down.

Using our virtual Mac OS 9

Finally, we want to start our freshly created machine without booting from the ISO. The command line is mostly the same, just omitting the ISO and boot-from-CD options:

qemu-system-ppc -cpu g4 -M mac99,via=pmu -m 512 -hda c:/retro/Mac/MacOS9HD.img -device usb-kbd -device usb-mouse -sdl

You'll probably want to put that in a shortcut icon.

When it boots for the first time you'll get a Register With Apple "wizard". Just press Windows+Q to quit this and get to that Platinum desktop!

Head to the Control Panel's Monitors applet to set the screen size/resolution you want. You might also want to head into the Appearance applet's Fonts tab to turn off anti-aliasing so you can enjoy the fonts in their pixel glory. (You can also switch from the revised Charcoal font back to the classic Chicago font here.) You may also have to switch screen resolution again if you see some odd artifacts or missing/doubled pixels when turning it off. (There's a quick resolution changer on the control strip in the lower left - it's the one with the checkerboard effect.)

Remember to always shut down correctly! Use the switcher at the top-right to "switch" to the Finder, then go through that Special, Shut Down process each time. QEMU will close several seconds after it's complete.

On your hard drive you'll find an Applications folder; dig into Internet Utilities, then the Classilla folder, and you can launch Classilla, which is a port of the Netscape browser made in 2014 (based on the Netscape Navigator 1.3.1 Nokia N90 port).
It was a valiant effort given how different Mac OS development was prior to Mac OS X - there were no Unix libraries or support, so ports were difficult and most applications were written in Metrowerks CodeWarrior - and the "classic" Mac OS was discontinued in 2002.

Still, Classilla is much better than IE 5.5, which fails to do anything at all. Google works, for example, but many sites don't render at all because of the push to later versions of SSL that the browser does not support. You can find a ton of old Mac software at The Macintosh Repository, but there are no more capable browsers.

Still, it's a fun environment to play with, and it's nice to have 100% accurate references to Geneva, Chicago, Monaco, Espy Sans etc., as most "conversions" tend to be hand-converted and mistakes are aplenty. I've done a few conversions myself this way on FontStruct and know how easy it is to make mistakes when working from screenshots, especially when it comes to spacing between letters.

It's also nice to see an old friend again. Despite regularly finding myself on retro machines and emulators spanning 8- and 16-bit machines, I don't have (or have the space for) a classic Mac, and emulation has been difficult. I think I last used Mac OS 9 in 2000 on an iMac at work before we put the Mac OS X Public Beta on it (I was a big NeXT/OpenStep fan and wanted to see what they had done with it!)

My thanks to James Badger for his general article on Mac OS 9 on QEMU.

[)amien
