Disclaimer: this is an automatic aggregator which pulls feeds and comments from many blogs of contributors that have contributed to the Mono project. The contents of these blog entries do not necessarily reflect Xamarin's position.

March 2

Mono Summer of Code 2015!

Hey everyone! The Mono team is pleased to announce that we are a mentor organization in the Google Summer of Code 2015! This is the eleventh year of Summer of Code for us, and we're really excited to work with a new group of students.

This is a great opportunity to spend the summer working with an amazing community on cutting-edge, open-source C# tools and frameworks. You can hone your development skills by working on large and complex codebases with experienced mentors, and get paid for your hard work, too.

If you're an eligible student, the application period runs from March 15-27. But don't let that stop you from starting on your proposals! Feel free to introduce yourself to the community and mentors, talk about your ideas, and do some preliminary research to make your proposal as strong as it can be. If you're feeling particularly ambitious, you could even get started on some quick bugfixes and patches to show off your skills; while this isn't required, it is really helpful in seeing how you work and getting your name out in the community. Show us how excited you are about coding!

Same as last year, our project ideas and rules are available on our GSoC ideas page, and we'll be updating the list as we come up with new ideas. Don't let these ideas limit you though; if you have your own idea for a great project for the summer, put it in a proposal and send it our way. Or, if you can't decide, you can always submit multiple proposals. Keep in mind, though, quality is better than quantity in this case.

Our project mailing lists should be your first stop for questions about contributing to Mono. There are many lists for different topics, but the main ones are mono-devel and monodevelop-devel. For external projects, you should also contact the developers on their project mailing lists.

And of course IRC is where you can find everyone online, on the irc.gnome.org server. There's the #mono channel for general Mono discussions, #monodev for Mono development, #monodevelop for MonoDevelop and Xamarin Studio, and #monosoc for Summer of Code-specific questions and saying "Hi" to your fellow students. Hang around a while after asking a question - we have mentors in many timezones so they may be asleep or busy when you visit.

If you're not a student, you can participate in Summer of Code by helping the students feel welcome in our community! Or, if you're interested in mentoring C# tools and libraries under the Mono umbrella, send an email to the Mono GSoC administrator at soc@xamarin.com.

To stay up to date with the application process and the work of our students, follow us on Twitter and Google+.
Good luck, and here's to another great summer of coding!

Follow the Rainbow to Xamarin Events!

Find your lucky charm at the end of the rainbow with Xamarin at this month’s developer group events!


DFW Mobile .NET

  • Irving, TX: Thursday, March 5th 6:00PM
  • Automated UI Testing and Monitoring for your Mobile Apps w/ James Montemagno

Kansas City Mobile .NET Developers Group

  • Kansas City, MO: Tuesday, March 10th 6:00PM
  • Intro to Apple’s WatchKit using Xamarin

Visual Studio ALM Days 2015

  • Düsseldorf, Germany: March 11–12th
  • Automating UI Testing and Xamarin Test Cloud with Mike James

Birmingham Xamarin Mobile Cross Platform User Group

  • Birmingham, United Kingdom: Wednesday, March 11th 6:30PM
  • A deep dive into Xamarin.Forms

Northern Virginia Mobile C# Developers’ Group

  • McLean, VA: Wednesday, March 11th 7:00PM
  • Code-sharing with MVVM Cross and Xamarin

Xamarin Meetup San Diego

  • San Diego, CA: Thursday, March 12th 6:30PM
  • Xamarin User Group Kick-Off!

London: DEVWEEK 2015

  • London, United Kingdom: March 23–27th
  • MVVM Pattern and Xamarin.Forms workshops with James Montemagno

Windows App London

  • London, United Kingdom: Wednesday, March 25th 6:30PM
  • What you want to know about Xamarin

Atlanta Xamarin Users Group

  • Atlanta, GA: Monday, March 30th 6:30PM
  • Intro to Xamarin Forms and Bluetooth LE

Didn’t see your city listed above? Not to worry, look to this Events Forum for even MORE upcoming Xamarin events, meetups, and presentations happening every day!

Want a developer group in your area and are interested in getting one started? We’re here to help! Here’s a tips and tricks guide on starting a developer group, a brand new introduction to Xamarin slide deck, and of course our community sponsorship program to get you on your way. Also, we love to hear from you, so please feel free to send us an email or tweet @XamarinHQ to help spread the word and continue to grow the Xamarin community.

Bamboo: Automatic merging of Plastic SCM branches

Atlassian's Bamboo is one of the most popular Continuous Integration servers, used by big enterprises and small startups alike. We released our own plugin to integrate Plastic SCM as a valid repository source in Bamboo quite a long time ago; however, you might have noticed that it didn't support an interesting Bamboo feature: automatic branch merging.

Well, not anymore. Since our release 5.4.16.647, we've extended our Bamboo plugin and adapted our client core to provide this option! But don't leave just yet. We'll show you a quick example of how to allow Bamboo to automatically merge your Plastic SCM branches, using our lightning-fast Plastic SCM Merge Machine behind the scenes.

February 27

Give Us the Gist of It Contest Winners!

Two weeks ago, we asked the community to share the code snippets that help them write amazing apps even faster. Five winners were chosen at random, and here is the gist of it:

Jason Fox:
Snippet Name: Xamarin.iOS Image Blur
Platform: Xamarin.iOS
Function: Image blur extension method for Xamarin.iOS

public static UIImage Blur(this UIImage image, float blurRadius = 25f)
{
  if (image != null)
  {
    // Create a new blurred image.
    var imageToBlur = new CIImage (image);
    var blur = new CIGaussianBlur ();
    blur.Image = imageToBlur;
    blur.Radius = blurRadius;
    var blurImage = blur.OutputImage;
    var context = CIContext.FromOptions (new CIContextOptions { UseSoftwareRenderer = false });
    var cgImage = context.CreateCGImage (blurImage, new RectangleF (new PointF (0, 0), image.Size));
    var newImage = UIImage.FromImage (cgImage);
    // Clean up
    imageToBlur.Dispose ();
    context.Dispose ();
    blur.Dispose ();
    blurImage.Dispose ();
    cgImage.Dispose ();
    return newImage;
  }
  return null;
}
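For context, here is a hedged usage sketch of the extension above (the image file name and the `imageView` control are hypothetical, not part of the original snippet):

```csharp
// Hypothetical usage, e.g. inside a view controller's ViewDidLoad:
var original = UIImage.FromFile ("photo.jpg"); // assumed bundled image
var blurred = original.Blur (10f);             // smaller radius = subtler blur
if (blurred != null)
    imageView.Image = blurred;                 // imageView is an assumed UIImageView
```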

Runar Ovesen Hjerpbakk:
Snippet Name: Async and await together with UIAlertController
Platform: Xamarin.iOS
Function: This snippet shows how to use a TaskCompletionSource to enable async and await together with UIAlertController.

public static class CustomerFeelingSheet {
 public static Task<CustomerFeeling> ShowRatingDialogAsync(UIViewController parent) {
   var taskCompletionSource = new TaskCompletionSource<CustomerFeeling>();
   var alert = UIAlertController.Create("howDoYouFeel".T(), null, UIAlertControllerStyle.ActionSheet);
   alert.AddAction(UIAlertAction.Create("likeIt".T(), UIAlertActionStyle.Default,
       a => taskCompletionSource.SetResult(CustomerFeeling.LikeIt)));
   alert.AddAction(UIAlertAction.Create("couldBeBetter".T(), UIAlertActionStyle.Default,
       a => taskCompletionSource.SetResult(CustomerFeeling.CouldBeBetter)));
   alert.AddAction(UIAlertAction.Create("cancel".T(), UIAlertActionStyle.Cancel,
       a => taskCompletionSource.SetResult(CustomerFeeling.DontCare)));
   parent.PresentViewController(alert, true, null);
   return taskCompletionSource.Task;
 }
}
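Because the method returns a Task, callers can simply await the user's choice; a minimal sketch (the handler name and the logging are illustrative assumptions):

```csharp
// Hypothetical caller inside a UIViewController, e.g. a button tap handler:
async void OnRateTapped (object sender, EventArgs e)
{
    CustomerFeeling feeling = await CustomerFeelingSheet.ShowRatingDialogAsync (this);
    Console.WriteLine ("User chose: {0}", feeling);
}
```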

Matthieu Guyonnet-Duluc:
Snippet Name: Android Behavior – WPF Style
Platform: Xamarin.Android
Function: Reproduce the beloved WPF behaviors in Android

<com.mycompany.behaviors.ListViewHideKeyboardOnScroll
        android:layout_width="0px"
        android:layout_height="0px"
        local:View="@+id/resultsList" />
public class ListViewHideKeyboardOnScroll : Behavior<AbsListView>
{
    public ListViewHideKeyboardOnScroll(Context context, IAttributeSet attrs)
        : base(context, attrs)
    {
    }

    #region implemented abstract members of Behavior
    public override void OnAttached()
    {
        View.ScrollStateChanged += HideKeyboard;
    }

    public override void OnDetached()
    {
        View.ScrollStateChanged -= HideKeyboard;
    }
    #endregion

    void HideKeyboard(object sender, AbsListView.ScrollStateChangedEventArgs e)
    {
        if (e.ScrollState == ScrollState.TouchScroll)
        {
            var inputManager = (InputMethodManager)this.Context.GetSystemService(Context.InputMethodService);
            inputManager.HideSoftInputFromWindow(View.WindowToken, HideSoftInputFlags.None);
        }
    }
}

Ken Pespisa:
Snippet Name: SQLite Extension methods for Save & Delete
Platform: Xamarin.iOS
Function: Save the specified entity by calling insert or update, if the entity already exists.

public static class SQLiteExtensions
{
   /// <summary>
   /// Save the specified entity by calling insert or update, if the entity already exists.
   /// </summary>
   /// <param name="pk">The primary key of the entity</param>
   /// <param name="obj">The instance of the entity</param>
   /// <typeparam name="T">The entity type.</typeparam>
   public static int Save<T>(this SQLiteConnection db, object pk, object obj) where T : new()
   {
       if (pk == null || db.Find<T>(pk) == null)
       {
           return db.Insert(obj);
       }
       return db.Update(obj);
   }
   /// <summary>
   /// Delete entities based on a predicate function
   /// </summary>
   /// <param name="predicate">The predicate specifying which entities to delete</param>
   /// <typeparam name="T">The entity type.</typeparam>
   public static void Delete<T>(this SQLiteConnection db, Expression<Func<T, bool>> predicate) where T : new()
   {
       var records = db.Table<T>().Where(predicate).ToList();
       foreach (var record in records)
       {
           db.Delete(record);
       }
   }
}
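As a rough usage sketch of these extensions (the `Customer` entity and the `db` connection are hypothetical; the `[PrimaryKey]` attribute follows the usual sqlite-net convention):

```csharp
// Hypothetical entity mapped by sqlite-net:
public class Customer
{
    [PrimaryKey]
    public int Id { get; set; }
    public string Name { get; set; }
}

// Insert-or-update in one call, then delete by predicate:
db.Save<Customer> (customer.Id, customer);       // inserts if Id not found, else updates
db.Delete<Customer> (c => c.Name == "Obsolete"); // removes all matching rows
```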

Ryan Davis:
Snippet Name: InlineTableViewSource
Platform: Xamarin.iOS
Function: A subclass of UITableViewSource that allows you to define UITableViewDataSource and UITableViewDelegate methods inline, rather than subclassing.

var cellId = new NSString("cell");
var tableView = new UITableView(View.Frame, UITableViewStyle.Grouped) {
    Source = new InlineTableViewSource {
            _NumberOfSections = (tv) => 2,
            _RowsInSection = (tv, section) => 5,
            _TitleForHeader = (tv, section) => String.Format("Section {0}", section),
            _GetCell = (tv, indexPath) => {
                var cell = tv.DequeueReusableCell(cellId) ?? new UITableViewCell(UITableViewCellStyle.Default, cellId);
                cell.TextLabel.Text = "hi";
                return cell;
        }
    }
};

Find even more speedy code snippets for your apps in the Get The Gist forum thread, and a big thanks to all who participated in the Give Us the Gist of It Contest!

Unity at GDC 2015

GDC is nearly upon us! It’s crazy hectic getting ready for such a big show but always an incredibly exciting week where we get to meet with so many of our current development community and meet new friends. As you might imagine, we’ve got a lot going on at the show! Here’s a little bit about it.

Unity Special Event

On Tuesday, March 3rd at 8:30AM PST, we’ll be holding a special event live from San Francisco to kick off our GDC. We’ll be sharing some big news, showing some beautiful demos, and inviting some special guests from the development community on stage.

Event details are coming soon on our social media channels, so stay tuned!

Unity Party

Don’t miss the Unity Party Wednesday night!
Register here: http://unity-gdc2015-party.eventbrite.com

Unity Dev Day

For those who chose an all-access or summit & tutorial conference pass, don’t miss the “Unity Developer Day” on Tuesday, March 3rd from 10:00am to 5:30pm in Room 2014, Moscone West Hall, 2nd Floor. We’re refining the final agenda, but in short it’ll be about digging deep into Unity 5 and learning from Unity engineers and game developers.
Here’s what we’re preparing:

  • Graphics improvements by Aras Pranckevičius (Rendering Plumber),

  • New audio mixer by Jan Marguc and Wayne Johnson,

  • Future of scripting with IL2CPP by Jonathan Chambers (Scripting Team Developer) and Mantas Puida (iOS Team Lead),

  • Unity Ads & Everyplay by Oscar Clark (Everyplay Evangelist) and Nikkolai Davenport (Dev Relations Engineer),

  • Cloud Build, Analytics and more services by Patrick Curry, John Cheng & Suhail Dutta,

  • Post mortem on producing high-end content by Veselin Efremov (Artist), Torbjorn Laedre (GFX Programmer), Dominic Laflamme (Lead Developer Storytelling),

  • And finally “Rebuilding Republique in Unity 5” postmortem by Camouflaj team Paul Alexander (Producer/Designer), Kevin Call (Engineer) and Stephen Hauer (Art Director)

Unity GDC 2015 Expo Booth – South Hall #1402

As usual, we’re setting up shop in the main expo of GDC. We encourage you to stop by and say hello, ask questions, and check out awesome games from the community and the great partners that make the Unity development ecosystem so amazing. We’ll have plenty of staff at the booth ready to answer questions and give you tours of Unity, including the big new features of Unity 5 and some awesome new demos. We’ll also have daily drawings to win cool prizes!

Talks at Unity booth

Additionally, we’ve got a slew of useful talks scheduled from Unity staff and partners designed to help you get the most out of Unity. We will be announcing the program shortly, and of course you can stop by the booth to see the schedule!

Games Pavilion

As usual, we’re very excited to be hosting several games currently in development by the Unity community of developers at the Unity booth. Come by, check out the games, talk to the guys that made them, and get inspired!

  • Republique Remastered from Camouflaj
  • Ori and the Blind Forest from Moon Studios
  • Gang Beasts from Bone Loaf
  • Mordheim: City of the Damned by Rogue Factor
  • Total War Battles Kingdom from Creative Assembly
  • Space Noir from N-Fusion
  • Super Dungeon Bros from React Games
  • Dyscourse from Owlchemy Labs
  • The Room Three from Fireproof Games
  • The Trace from Relentless
  • Pollen from Mindfield Games
  • Armello from League of Geeks
  • TBA from NVYVE
  • Unkilled by Madfinger Games
  • Wind Up Knight 2 Wii U by Robot Invaders

and two other unannounced titles!

Partner Pavilion

Make sure to stop by and check out the latest technologies and platforms being showcased by our sponsors in the partner pavilion:

Microsoft

This year we’re also pleased to have a new dedicated area for our Asset Store Publishers! The following twelve publishers will be showcasing their tools and technologies for one day during the show, rotating across three kiosks:

Cinema Suite, Houdini, Hutong Games/PlayMaker, Make Code Now, Neat Corporation/Shader Forge, Owlchemy Labs, Polygonmaker, ProCore, Rust Ltd, SonicEther Technologies, TextMesh Pro and Tigar Shark Studios.

February 26

Join Xamarin for GSMA Mobile World Congress 2015

Xamarin will take the stage alongside AirWatch, Box, and Salesforce in Barcelona at Mobile World Congress next week.

Xamarin’s Director of Enterprise Mobility, Steve Hall, will join the “AirWatch Presents a New Standard for Enterprise App Development” panel discussion on March 2nd. Employees expect – and need – fast, on-the-go access to company data, and we’ll share how enterprises can successfully build and distribute secure mobile apps.


AirWatch Presents a New Standard for Enterprise App Development featuring Box, Salesforce & Xamarin

Tuesday, March 3, 1:30 – 2:25 pm CET
AirWatch Connect Stand
Hall 3, Stand 3D10
All MWC Attendees are welcome to attend.

See you in Barcelona!

Using history to better explain branch differences

Release BL647 introduced a great step forward in the way branch (and cset) differences are displayed and explained. Now it is possible to understand where each difference comes from.

Remastering Republique: The Journey to Unity 5

Greetings, fellow Unity developers! We are Camouflaj, a game studio based near Seattle, WA. We are the folks behind République, an episodic stealth action game about governmental surveillance. To date, we’ve shipped three episodes (of five) to an overwhelmingly positive reception.

Back in 2012, we promised to make a “true” PC & Mac version of République that was in no way a simple mobile port. We spent countless hours thinking and experimenting with ways to make our upcoming PC & Mac release all the more special.

Soon after Unity 5 was announced, the team started dreaming up ways we could use that new technology to make a big splash on PC. We wanted to totally remaster the game in Unity 5.

That’s when we approached Unity with a proposal: in exchange for early access to Unity 5’s alpha and beta releases, why doesn’t the Camouflaj team document their journey from Unity 4 to Unity 5? We’d love to leverage République as a standout title on Unity 5, and share the story of our development with the public so they can learn from our successes and failures. Thankfully, the folks at Unity said yes.

Today we are proud to share the developer diary about our journey to remaster Republique in Unity 5!

Each of the five episodes from our dev diary includes a video and a podcast. Ultimately, our modest hope is that our “journey” series is helpful to you.

Here’s a more detailed breakdown of what we cover:

Dev diary 1: République enters the Next Gen

Why we are using Unity 5 to take Republique to PC

Dev diary 2: République Migrates From Unity 4

We’ll explain how we moved our project from Unity 4 to Unity 5 in the midst of a chaotic, 20-person project, starting with our initial investigation.

Dev diary 3: République in Physically Based Shading

This is the really exciting stuff. We’ll go over a little bit of Physically Based Shading for the uninitiated, and explain how we put this to work in our game.

Dev diary 4: République Lighting & More

We’ll walk you through how we made use of Reflection Probes, Global Illumination, cookies and other good stuff, plus we’ll cover some physics and animation refinements.

Dev diary 5: République Ships on Unity 5

We’ll document our push to launch, and how we optimized and (fingers crossed) shipped a fantastic game.

Thank you for taking this journey with us!

-Camouflaj

February 25

Triggers in Xamarin.Forms

Triggers were introduced in Xamarin.Forms 1.3 along with Behaviors, which we covered previously. Triggers allow you to declaratively express actions in XAML that are executed when a specified condition is met. Xamarin.Forms supports four types of triggers:

  • Property Trigger – executed when a property on a control is set to a particular value.
  • Data Trigger – same as the property trigger but uses data binding.
  • Event Trigger – occurs when an event occurs on the control.
  • Multi Trigger – allows multiple trigger conditions to be set before an action occurs.

Let’s take a look at each one in detail.

Property Trigger

Property Triggers (represented by the Trigger element) are added to a control’s Triggers collection. The Setter collection inside is executed when a specified property equals the specified value.


Wouldn’t it be nice to provide some visual indicator that an input control has focus? To achieve this, we can set the BackgroundColor property when the property IsFocused of the Entry element is true.

<Entry Placeholder="enter name">
    <Entry.Triggers>
        <Trigger TargetType="Entry"
             Property="IsFocused" Value="True">
            <Setter
                Property="BackgroundColor"
                Value="Yellow" />
        </Trigger>
    </Entry.Triggers>
</Entry>

Alternatively, we can set them in styles so that they can be attached to every Entry element in the screen.

<ContentPage.Resources>
   <ResourceDictionary>
     <Style TargetType="Entry">
       <Setter Property="AnchorX" Value="0" />
       <Style.Triggers>
         <Trigger  TargetType="Entry"
                   Property="IsFocused"
                   Value="True">
           <Setter Property="BackgroundColor"
                   Value="Yellow" />
         </Trigger>
       </Style.Triggers>
     </Style>
   </ResourceDictionary>
</ContentPage.Resources>

Data Trigger

DataTriggers are very similar to PropertyTriggers, except that instead of specifying the Property, we specify the Binding for the trigger. This Binding generally refers to another VisualElement’s property on the page or it could reference a property in a ViewModel.

The code below shows how to disable the button when the entry’s Text.Length property is 0.

<StackLayout Spacing="20">
<Entry x:Name="emailAddress" Text="" Placeholder="email address"/>
<Button Text="Send">
  <Button.Triggers>
    <DataTrigger TargetType="Button"
         Binding="{Binding Source={x:Reference emailAddress},
                                           Path=Text.Length}"
         Value="0">
      <Setter Property="IsEnabled" Value="False" />
    </DataTrigger>
  </Button.Triggers>
</Button>
</StackLayout>

Event Trigger

Event Triggers execute user-defined code when a specified event occurs.

In the above Property Trigger example, we saw how to change the background color of an Entry element based on the IsFocused property entirely in XAML. Alternatively, we can use an Event Trigger to execute an action written in C# based on the TextChanged event of an entry to perform some basic validation.

Define the TriggerAction in code

Every action that we define has to inherit from TriggerAction<T> where T is the element to which a trigger is attached. When a trigger is fired, the Invoke method will be called. In the code below, we change the Entry’s BackgroundColor to indicate whether the input is valid or not.

public class NumericValidationTriggerAction : TriggerAction<Entry>
{
   protected override void Invoke (Entry entry)
   {
      double result;
      bool isValid = Double.TryParse (entry.Text, out result);
      entry.BackgroundColor =
            isValid ? Color.Default : Color.Red;
   }
}

TriggerAction in XAML

To use the C# code, just declare a namespace for the assembly (xmlns:local in this sample) and add the NumericValidationTriggerAction element to the event trigger:

<Style TargetType="Entry">
<Style.Triggers>
    <EventTrigger Event="TextChanged">
        <local:NumericValidationTriggerAction />
    </EventTrigger>
</Style.Triggers>
</Style>

Multi Trigger

A MultiTrigger looks similar to a Trigger or DataTrigger except there can be more than one condition. All the conditions must be true before the Setters are triggered.

In the code below, we enable the button only when either the email or the phone entry is filled in by the user. Each condition is true when the length of the text input is zero (i.e. nothing has been entered). When both conditions are true (i.e. both are empty), the trigger’s Setters are called, which in this case disables the button. When either has text entered, the overall condition becomes false and the button is enabled.

<Style TargetType="Button">
<Style.Triggers>
  <MultiTrigger TargetType="Button">
    <MultiTrigger.Conditions>
      <BindingCondition
          Binding="{Binding Source={x:Reference email},
                            Path=Text.Length}"
          Value="0" />
      <BindingCondition
          Binding="{Binding Source={x:Reference phone},
                            Path=Text.Length}"
          Value="0" />
    </MultiTrigger.Conditions>
    <Setter Property="IsEnabled" Value="False" />
  </MultiTrigger>
</Style.Triggers>
</Style>

To see how to build a “require all” trigger (like you’d use in a login page, for example) check out our Triggers sample on GitHub that uses an IValueConverter along with a MultiTrigger.

For even more information on Xamarin.Forms, be sure to check out the detailed documentation.

Discuss this post in the Xamarin forums

How to setup an encrypted server

We are happy to announce a new Plastic SCM feature that allows configuring a server with encrypted data.

It means that, in your organization, you can configure a central server where all the data is encrypted. This way, only the users who have the specific key will be able to push/pull data to this server.

It is important to note that this server is created for replication purposes only. Repositories have all of their data encrypted, so if we download the file content directly to a workspace, we will only see empty files (the data is encrypted).

This configuration can be very useful when your server is accessible from a public network, or when you need to be sure that even if an unauthorized person accesses your server, they will not be able to get any information.

February 24

Live APAC Webinar: Go Mobile with Xamarin


Join Xamarin Evangelist Mayur Tendulkar for this live webinar timed just for our APAC customers, where you’ll learn how to leverage your existing Microsoft .NET and C# skills to build iOS, Android, and Windows Phone apps using Visual Studio and Xamarin. We’ll also talk about how to maximize code sharing and reuse existing .NET libraries.

At the end of the webinar, you’ll have the skills you need to create your first iOS and Android apps in C# with Xamarin in Visual Studio.

Wednesday, March 11
11:30 AM – 12:30 PM IST

Register

All registrants will receive a copy of the webinar, so please feel free to register even if you can’t attend.

Gorgeous Arch-Viz in Unity 5

Is it possible to dial up the quality level in Unity 5 high enough to make high-end architectural visualizations?

In response, Alex Lovett, aka @heliosdoublesix, built this gorgeous architectural visualization demo in Unity 5.

It makes good use of the real-time global illumination feature, physically based shading, reflection probes, HDR environment lighting, the linear lighting pipeline, and a slew of post-effects, all in order to achieve the visual fidelity expected in an architectural visualization.

The aim was to push for quality, so very high resolution textures were used and the model has just over 1 million faces.

There is no baked lighting in this scene.

The first part of the demo has a fast moving sun. The second part has more localized lighting; a spot light from a fellow maintenance robot lights up the environment in addition to the headlight of the robot the viewer is piloting. In both parts there is considerable environment lighting.

Due to how the scene is laid out, there is a lot of bounced lighting and also quite distinct penumbrae caused by indirect lighting. For example, the v-shaped columns cast a very sharply defined indirect shadow onto the ceiling, which is especially visible in the night time part of the video.

(Screenshot gallery: scene views, reflection probes, top-down shadows, night-time shots, and indirect shadow penumbrae.)

Using high resolution real-time lightmaps

When the lighting changes, these penumbrae and the overall lighting gradients have to change significantly. In order to do this with global illumination, the Enlighten powered real-time lightmaps feature was employed. Traditionally, Enlighten is used in-game at relatively low resolutions (1-2 pixels per meter). This works well because the bounced lighting is generally quite low-frequency.

In this demo, a much higher density is used to capture the fine details in the lighting: an overall density of 5 pixels per meter. There are about 1.5 million texels in the real-time lightmaps in total. In the resolution screenshot below, you get a sense of the density in relation to the scene size.

At this resolution, the precompute time spent was about 2.5 hrs. The scene is automatically split into systems in order to make the precompute phase parallelizable. This particular level was split into 261 systems. The critical path through the precompute (i.e. the sum of the most expensive job in each stage along the pipeline) is about 6 minutes. So there are significant gains to be made by making the precompute distributed. And indeed going forward, one of the things we will address is distribution of the GI pipeline across multiple computers and in the cloud. We will look into this early in the 5.x cycle.

See geometry, GI systems and real-time lightmap UV charting screenshots from the Editor below:

(Editor screenshots: geometry, real-time lightmap systems, and real-time lightmap texture density with UV charts.)

Interactive lighting workflow

Once the precompute is done, the lighting can be tweaked interactively. Lights can freely be animated, added, removed, and so on. The same goes for emissive properties and HDR environment lighting. This demo had two lighting rigs: one for daytime and one for night time. Both were driven from the same precompute data.

“I’m able to move the sun / time of day and change material colors without having to rebake anything. I can play with it in real-time and try combinations out. For a designer like me, working iteratively is not only easier and faster, but also more fun,” says Alex Lovett.

Lighting 1.5 million texels with Enlighten from scratch takes less than a second. And the lighting frame rate is decoupled from the rendering loop, so it will not affect the actual rendering frame rate. This was a huge workflow benefit for this project. Interactive tweaking of the lighting across the animation without interruption drove up the final quality.

To make this a real-time demo, some rudimentary scheduling of updating the individual charts would have to be added, such that visible charts are updated at real-time, while occluded charts and charts in the distance are updated less aggressively. We will look into this early in the 5.x cycle.

Acknowledgements

A big thanks to Alex Lovett, owner of shadowood.uk, who has been tirelessly stress testing the GI workflow since it was in alpha. Also thanks to the Geomerics folks, especially Roland Kuck.

The following Asset Store items were used: SE Natural Bloom & Dirty Lens by Sonic Ether, and Amplify Motion and Amplify Color by Amplify Creations.

Web continuous integration with Plastic SCM and Azure

Microsoft defines Azure Websites as a fully managed Platform-as-a-Service (PaaS) that enables you to build, deploy, and scale enterprise-grade web apps in seconds. Since most modern web development teams promote new code from staging to production environments, it is important to consider various techniques to automatically deploy your code as part of your ALM (Application Lifecycle Management) process. The technique that we will focus on in this article is Plastic SCM with GitSync. With this technique, your team can quickly and easily publish changes automatically to Azure Websites from GitHub.

February 23

Adding Real-world Context with Estimote Beacons and Stickers

It’s no secret that iBeacons have created a buzz in the development community. Leveraging these Bluetooth Smart devices enables developers to add contextual awareness to their mobile apps with just a few lines of code. iBeacons were everywhere at Evolve 2014, including at the forefront of the Evolve Quest scavenger hunt and the conference mini-hacks, as well as taking the main stage for an in-depth session.

Estimote, a leader in the iBeacon space, recently introduced Estimote Stickers, low-powered devices to go alongside their traditional beacons. Stickers can be attached to almost anything, turning any everyday item into a “nearable” – a smart object that can transmit data about its location, motion, temperature, and environment to nearby apps and devices. Today, we’re pleased to announce the Estimote SDK for iOS, available on the Xamarin Component Store, enabling developers to easily detect Beacons and Estimote Stickers with a beautiful C# API that includes events and async/await support.

Detecting Nearables

Nearables have a new, simplified API. Each Nearable has a specific NearableType that can be used to detect, for example, Car, Dog, or Bike. You can decide to range for a specific type or all nearby Nearable devices.

Let’s see how easy it is to get up and running with Nearables by scanning for all Nearables that are close by.

Install the Estimote SDK for iOS

The very first task is to set up a new Xamarin.iOS project and add the Estimote SDK for iOS from the component store.


In addition to the SDK, you must specify NSLocationAlwaysUsageDescription or NSLocationWhenInUseUsageDescription in your Info.plist file, with a description that will be shown to your users, since iBeacons use CoreLocation functionality.
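For reference, the corresponding Info.plist entry looks like this (the description string below is just an example; use your own wording):

```xml
<key>NSLocationWhenInUseUsageDescription</key>
<string>This app uses your location to detect nearby beacons and stickers.</string>
```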

Setting Up App ID

When you log in to your Estimote Cloud, you are able to manage all of your Beacons and Stickers in addition to creating API keys for your mobile apps. Once you have an app set up in the Estimote Cloud, you can configure the app in your AppDelegate’s OnFinishedLaunching method:

Config.SetupAppID ("<appId from cloud>", "<appToken from cloud>");

While not required, it’s recommended to set up your app ID so that the SDK can communicate with the Estimote Cloud and pull in unique attributes.

Ranging Nearables

Using the new NearableManager you can easily range for Nearables by subscribing to the RangedNearables event.

NearableManager manager;
public override void ViewDidLoad ()
{
  base.ViewDidLoad ();
  manager = new NearableManager ();
  manager.RangedNearables += (sender, e) => {
    //Nearables detected, load into TableView or pop up alert
    new UIAlertView("Nearables Found", "Just found: " + e.Nearables.Length + " nearables.", null, "OK").Show();
  };
  //Specify the type of Nearable to range for. In this instance return All types.
  manager.StartRanging (NearableType.All);
}

Estimote Nearables Detected

The real power of Nearables is the additional attributes that are received when they are detected, such as their temperature, orientation, acceleration, and more. As an example, you could easily use these attributes to detect a Bike Nearable in motion for over 45 minutes and prompt your user to perhaps take a break.

NearableManager nearableManager;
public override void ViewDidLoad ()
{
  var identifier = "94064be7a9d7c189"; //Identifier ranged earlier
  var durationThreshold = 45 * 60; //45 minutes
  nearableManager = new NearableManager ();
  nearableManager.RangedNearable += (sender, e) => {
    var bike = e.Nearable;
    if(bike.IsMoving && bike.CurrentMotionStateDuration > durationThreshold) {
      Console.WriteLine("Bike is moving and has been in motion for over 45 minutes!");
    }
  };
  nearableManager.StartRanging(identifier);
}

Triggers and Rules

In addition to ranging and monitoring Nearables, the Estimote SDK includes an advanced trigger system that enables you to specify several rules that will trigger a notification. Let’s say you want to be notified every time a Nearable changes orientation and is laid down in a horizontal position. You would simply create an OrientationRule and use the TriggerManager to wait for the Nearable’s state to change.

TriggerManager triggerManager;
public override void ViewDidLoad ()
{
  var rule = OrientationRule.OrientationEquals (NearableOrientation.Horizontal, NearableType.Shoe);
  var trigger = new Trigger (new Rule[]{ rule }, "TriggerId");
  triggerManager = new TriggerManager ();
  triggerManager.ChangedState += (sender, e) => {
    Console.WriteLine ("Shoe nearable has been placed horizontal");
  };
  triggerManager.StartMonitoring (trigger);
}

More complex rules can be configured that are based on DateTime, temperature, proximity, and more.
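As an illustrative sketch only – TemperatureRule and its TemperatureGreaterThan factory are assumed names modeled on the OrientationRule pattern above and may differ in the shipping binding – a temperature-based trigger might look like:

```csharp
// Hypothetical sketch: TemperatureRule / TemperatureGreaterThan are assumed names,
// not confirmed API. The overall Trigger/TriggerManager flow matches the sample above.
var warmRule = TemperatureRule.TemperatureGreaterThan (20, NearableType.Bag);
var trigger = new Trigger (new Rule[]{ warmRule }, "WarmBagTrigger");
triggerManager = new TriggerManager ();
triggerManager.ChangedState += (sender, e) => {
  Console.WriteLine ("Bag nearable is warmer than 20 degrees");
};
triggerManager.StartMonitoring (trigger);
```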

Enhanced C# Beacon API

Xamarin.iOS has been able to detect iBeacons from any vendor since the feature was introduced in CoreLocation in iOS 7. However, the Estimote SDK greatly simplifies the task of requesting authorization and provides a simplified API for ranging and monitoring beacons. In addition, if you are using Estimote Beacons you can tap into advanced features, such as their accelerometer.
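As a rough sketch of beacon ranging with the component – BeaconManager, the RangedBeacons event, and the BeaconRegion constructor shown here follow the style of the Nearable API above but are assumptions, not confirmed signatures:

```csharp
// Sketch only: class, event, and constructor names are assumed, not confirmed API.
BeaconManager beaconManager;
public override void ViewDidLoad ()
{
  base.ViewDidLoad ();
  beaconManager = new BeaconManager ();
  beaconManager.RangedBeacons += (sender, e) => {
    Console.WriteLine ("Ranged " + e.Beacons.Length + " beacons.");
  };
  // Estimote's default proximity UUID, with major/minor left unspecified
  var region = new BeaconRegion ("B9407F30-F5F8-466E-AFF9-25556B57FE6D", "BeaconSample");
  beaconManager.StartRangingBeacons (region);
}
```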


Learn More

The Estimote SDK for iOS has plenty of great samples for both Nearables and Beacons for you to start out with, including a full Getting Started Guide. In addition, Estimote has SDK reference documentation and a developer portal with more information.

If you are interested in adding iBeacon functionality to your Xamarin.Android apps, be sure to check the component store for multiple libraries that you can take advantage of.

Discuss this post on the Xamarin Forums.

February 21

Apple Watch Kit round-up

It's Saturday, a good excuse for a 'fun' post. Here's a little collection of tidbits about the Apple Watch...



Apple: Watch Kit - if you're thinking of developing for the platform, might as well start at the source :)

Wareable: The best Apple Watch apps... - some great screenshots of apps already being built, including Clear, BMW, and Nike. It's interesting to see the UI design approach being taken by different developers. Check out the similar list on ibtimes.com

FastCompany: How the Apple Watch will work... - a couple of thoughts on app design, and screenshots of Todoist.

eleks labs' unofficial Tesla app - more design thoughts and a prototype video (unofficial development, not affiliated with Tesla).

Daring Fireball: On the Pricing of the Apple Watch - so yeah, "starting at $349" sounds like it's going to be the understatement of the year.

WatchKit FAQ - Awesome collection of questions and answers (and cute watch drawings too).

MartianCraft: Designing for the Apple Watch with Briefs - even if you don't use the tool (which looks great) this is a lovely post on Watch app design.

If that's got you interested in building apps for the Apple Watch, it's time to check out Xamarin's Watch Kit Preview and how to get started (including video) and my first watch app.



I've also got a few samples, including Magic 8 Ball, Calculator, Insta, and Todo for you to try.

^ watch frame screenshots generated with Bezel thanks to the fine folks at infinitapps.

February 20

Nordic Game Jam 2015

A couple of weeks ago, several of us from the Unity Copenhagen office took part in the Nordic Game Jam. With around 730 participants, it’s probably the largest game jam in Europe. I’d been told in the past that I absolutely had to try this, but all the other game jams I’ve been to before were much smaller, so I didn’t know what to expect. Here’s what went down!

People from different countries flew in to Copenhagen for the two day jam that took place at Aalborg University, which consists of these two enormous buildings by the water connected by a bridge. The view from the bridge was beautiful and great to catch the sunrise from on a clear day!

Polish invasion! Live DJ set at NGJ pre-party

While the actual jam kicked off on a Friday, we started getting into Nordic Game Jam mode the day before. A large group of game devs from Poland came to visit us at the Unity Copenhagen office, in what later became known as the Polish Invasion. After a day of hanging out, we gathered the troops and went to the NGJ pre-party in Christiania, where lots of dancing, playing indie games like Progress, and catching up with friends took place. A couple of game journalists accompanying the game devs wrote a nice piece about their visit to our office.

Once in place at Aalborg University for NGJ, we set up a booth where participants could stop by and chat with our HR manager Anders about landing a job at Unity and get temporary Unity tattoos! NGJ interviewed Anders about working at Unity.


While part of the audience may have been a bit tired after the pre-party, you could sense the atmosphere of excitement the next day. I gathered a group of friends from Sweden, Poland and Germany, which turned out to be a really cool team.

The theme of the game jam, “OBVIOUS”,  was revealed after a day filled with talks, including one from James Portnow of Extra Credits and a keynote from Steve Swink. What’s pretty cool about game jams in general is that anyone can participate, whether you have created several games or are completely new to game development. Extra Credits recently worked with us on a series of videos about getting started with game development and I believe game jams to some degree fill the same purpose.


Everyone split up into their groups, moved into rooms or spaces for developing, started brainstorming game ideas, and figured out what skills everyone had that could be put to use. You could hear lively discussions going on and feel the atmosphere of creative minds interchanging genius thoughts. Different groups had different methods of getting their thought processes going: pinning googly eyes on pineapples and spontaneous dancing both took place.

The NGJ organisers made sure any type of game could be created during the jam. There was equipment for creating arcade games, material for board games, 3D printers, Oculus DK2s, joysticks, and so on. The best part was being able to use the sound lab, which looked insane when I first walked in: a dark room covered in enormous spikes pointing directly at you, so quiet I could hear my own thoughts. I used this room to record the voice acting for the role of a pregnant woman in our game, which was a pretty interesting experience as well. Getting to scream as loud as I could in a room all by myself is not an everyday activity. My voice did take quite a beating, though, and I was still recovering from a sore throat the week after. Totally worth it.


Though there were teams that got started on their games on Friday night, my team first decided on a specific game idea during our Saturday morning meeting. It really was a matter of “our deadline to decide on something is before lunchtime and after that we work work work.” And so we did. Feeling confident, we all popped back up into the space we’d taken over the night before and started producing. Having quite a large group, we split the areas up well, so each person was able to dedicate their time to art, code, design or audio. The group sizes at NGJ varied from 2 to 6 people. A few lovely souls also jumped between teams to help out in any area they could, which was super awesome.

Several groups stayed up late or pulled all-nighters to finish their games for Sunday’s submission deadline. I believe around 140 games were submitted, so presentations took place in separate rooms, where participants were able to vote for their favourite. Several of the games were made using Unity, and you can play many of them on NGJ15’s itch.io site. One of the best things about game jams is that you never know the outcome of what people are working on; projects typically start out pretty comprehensible, but can quickly turn ridiculous, which makes them that much more memorable.

A ceremony was held after the absolute final voting had taken place and the jury had made their decisions on which games were the best in each category. Awards were handed out, speeches were given, songs were sung and everyone was happy with the results. A great game jam, making new friends and just having a swell time is the summary of the weekend.

But before you stop reading, check out a couple of my favourite submissions:

Look at my drawing


Press F to Win


Hest til fest


Double Trouble


There were many more good games; you can find a complete list and play some of them on NGJ15’s itch.io site.

Here are a couple of games made by the teams that included some Unity folks:

Gone


Express Delivery


Black Hole Battle: not #madewithunity, but a board game instead!


Once again, a big thank you to the organisers for creating such a fun and memorable event, we look forward to participating next year which also happens to be NGJ’s 10 year anniversary!

February 19

Unity 4.6.3: Metal rendering support and update of IL2CPP for iOS

Today we shipped the public release of Unity 4.6.3. You can get it on our download page. With this release, we’re bringing iOS Metal rendering support to Unity 4.x. Unity 4.6.3 is the first Unity 4.x public version supporting both critical features in the iOS world: iOS 64-bit via IL2CPP and Metal rendering. Unity 4.6.3 also brings critical updates to IL2CPP for iOS 64-bit.

What is Metal rendering?

It is a new low-level rendering API developed by Apple for iOS 8 and later. It focuses on doing less in GPU drivers, so the CPU overhead of making Metal calls is minimal. This way, games consume less CPU time and can do more fancy stuff in the freed-up time.

Here’s a short description from Apple:

“Metal provides the lowest-overhead access to the GPU, enabling you to maximize the graphics and compute potential of your iOS 8 app. With a streamlined API, precompiled shaders, and support for efficient multi-threading, Metal can take your game or graphics app to the next level of performance and capability.”

For more information, please consult the official Apple Metal rendering developer site.

How to enable Metal rendering?

To bring Metal support, Unity takes care of most of the things that happen behind the scenes. Metal will be used by default on capable devices. If you want more control, you can find the Graphics API selector in Player Settings, with values like Automatic, Metal, OpenGL ES 3.0 and OpenGL ES 2.0.

If you want to detect whether you’re running on Metal at runtime, do something like if (SystemInfo.graphicsDeviceVersion.StartsWith("Metal")).
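Wrapped into a small script, that check could look like this (SystemInfo.graphicsDeviceVersion is Unity’s real API; the component name and logging here are just illustrative):

```csharp
using UnityEngine;

// Attach to any GameObject to log the active graphics API at startup.
public class MetalCheck : MonoBehaviour
{
    void Start ()
    {
        // graphicsDeviceVersion reports the active API, e.g. "Metal" or "OpenGL ES 3.0"
        if (SystemInfo.graphicsDeviceVersion.StartsWith ("Metal"))
            Debug.Log ("Rendering with Metal");
        else
            Debug.Log ("Rendering with " + SystemInfo.graphicsDeviceVersion);
    }
}
```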

We worked really hard to make Metal usage as seamless as possible, but please report issues if you run into them!

Update of IL2CPP on iOS-64 bit

Unity 4.6.3 is a critical update to IL2CPP on iOS-64 bit:

  • Fifty fixes were made for various bugs and crashes. We are very grateful for your feedback which enabled us to move and iterate fast.

  • Missing .NET class support was added for ThreadPool, asynchronous sockets, and WebRequest.

  • Added support for async delegates (BeginInvoke/EndInvoke).
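As a minimal, platform-agnostic illustration of the ThreadPool item above (nothing here is IL2CPP-specific; this is plain .NET code of the kind that now works under IL2CPP):

```csharp
using System;
using System.Threading;

class ThreadPoolDemo
{
    static void Main ()
    {
        int result = 0;
        var done = new ManualResetEvent (false);

        // Queue a work item to the thread pool and wait for it to finish.
        ThreadPool.QueueUserWorkItem (_ => {
            result = 6 * 7; // simulate background work
            done.Set ();
        });

        done.WaitOne ();
        Console.WriteLine (result); // prints 42
    }
}
```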

We are committed to fixing and improving IL2CPP support for iOS-64 bit in further Unity 4.6.x patches and public releases as well as in Unity 5, so if you have any issues, do not hesitate to report those and ping us on the forums.

Other goodies

The Unity 4.6.3 release is not limited to Metal rendering and IL2CPP on iOS. It has a number of fixes and improvements for Android, iOS, 2D, animation, shaders, UI and more. For a full list of changes, please consult the release notes.

Code Sharing Strategies for iOS & Mac

I fell in love with the Xamarin approach to mobile development because sharing code across platforms provided me with huge productivity gains when developing apps. With Xamarin you can share an average of 75% of app code across all mobile platforms, and in this blog post I’m going to give you some strategies to help you share even more code between iOS and OS X. If you’ve recently developed an app for iPhone or iPad using the traditional approach (sans Xamarin.Forms), you might be surprised to learn how much code you can share between iOS and OS X. With the Mac platform becoming increasingly popular, there’s never been a better time to consider whether your apps could benefit from targeting a new platform. Let’s learn some tips and tricks for sharing more code between the platforms, all in the context of C#.


General Code Sharing Strategies

It’s common knowledge that both iOS and OS X share a common architecture, which results in a great code sharing story. Many classes are compatible on both platforms without modification and with the recent release of the Unified API, we’ve made it even easier to share code between OS X and iOS.

Before we get started on linking all of our existing Xamarin.iOS code in a new Xamarin.Mac project, we need to first look at what we should share and what should remain platform dependent. The most common architectural pattern for iOS and OS X development is the Model View Controller (MVC) pattern. This increases the amount of code reuse in our app as many of the models and controllers will still be relevant regardless of the underlying platform.

Conditional Compilation (Shimming)

Sharing our view code is a little more involved, but it is possible with a couple of techniques used by Apple in apps such as Keynote. Your existing iOS apps will be using the UIKit namespace, the framework that provides the window and view architecture needed to manage your app’s user interface. Here you will find the labels, buttons, colors and other classes you’ll build your app’s UI with. UIKit is only available on iOS, which means any code utilizing this framework will not run on the Mac without some modification. Let’s take a look at a simple example: sharing colors between platforms.

iOS

var xamarinBlue = UIColor.FromRGB(0.26f, 0.83f, 0.31f);

Mac

var xamarinBlue = NSColor.FromCalibratedRgba(0.26f, 0.83f, 0.31f, 1f);

The above example is fairly consistent with what I find when looking at customers’ projects. With a little bit of trickery, we can share colors between platforms. Apple calls this “shimming” but you might know it as “conditional compilation”.

public class Color
{
#if __MAC__
    public static NSColor FromRGB(nfloat r, nfloat g, nfloat b)
    {
        return NSColor.FromCalibratedRgba(r, g, b, 1f);
    }
#endif
#if __IOS__
    public static UIColor FromRGB(nfloat r, nfloat g, nfloat b)
    {
        return UIColor.FromRGB(r, g, b);
    }
#endif
}

Now in both our iOS and OS X app, we can use the following to create our blue color.

var xamarinBlue = Color.FromRGB(0.26f, 0.83f, 0.31f);

If this is running on iOS, it will return a UIColor and on OS X we will get an NSColor. This is an approach I apply in many areas of UIKit and AppKit. You could for example extend this to UIImage and NSImage.

public static CGImage ToCGImage(string imageName)
{
#if __MAC__
    return NSImage.ImageNamed(imageName).CGImage;
#endif
#if __IOS__
    return UIImage.FromFile(imageName).CGImage;
#endif
}

Sharing Your UI with CALayers

If you want to maximize your code sharing, then you might want to investigate using CALayers. UIViews are built on CALayers, which can be accessed via the Layer property of a UIView. The benefit of using CALayers over UIViews is that CALayers can very easily be ported to OS X, with no performance loss compared to using UIViews or NSViews. Apple’s Keynote canvas uses CALayers, which allows them to share over 1 million lines of code between OS X and iOS.

In the example below, I’ve inherited from a CALayer and overridden the DrawInContext method to get the view setup. I set the background color, using my shimming method, to be purple. I then override the HitTest, which allows me to respond to touch or click events. In this sample, I want to change the background color of the layer every time the user interacts with it. Despite being a basic example, this code works on both iOS and OS X without any modification.

public class ColorChanger : CALayer
{
    int count = 0;

    public override void DrawInContext(CGContext ctx)
    {
        base.DrawInContext(ctx);
        BackgroundColor = Color.FromRGB(0.65f, 0.22f, 0.72f).CGColor;
        count = 0;
        this.Contents = Image.ToCGImage("xamagon.png");
    }

    public override CALayer HitTest(CGPoint p)
    {
        // Cycle through three background colors on each touch/click.
        switch (count)
        {
            case 0:
                BackgroundColor = Color.FromRGB(0.2f, 0.52f, 0.89f).CGColor;
                count++;
                break;
            case 1:
                BackgroundColor = Color.FromRGB(0.26f, 0.83f, 0.31f).CGColor;
                count++;
                break;
            case 2:
                BackgroundColor = Color.FromRGB(0.65f, 0.22f, 0.72f).CGColor;
                count = 0;
                break;
        }
        return base.HitTest(p);
    }
}

Conclusion

With Xamarin, you’ve always been able to share approximately 75% of your code between the different platforms, and now with the above tips you can share even more. If you’re looking for basic cross-platform drawing, you may find Frank Krueger’s CrossGraphics library useful, as it allows for drawing graphics on Android, iOS, Mac, Windows, and ASP.NET using .NET.

February 18

Xamarin App Video Spotlight: Curse Inc.

At Xamarin Evolve 2014, I had the opportunity to speak with Xamarin customer Curse, a multimedia technology company that builds websites and software for gamers. With 50 million users on their websites and 6 million users on their desktop client, Curse turned to Xamarin to help them build out Mac, Android, and iOS apps for their new product, Curse Voice.

Watch the video below to get a better understanding of how the Curse team was able to take their existing Windows code and get their innovative Curse Voice apps up and running quickly on Mac, iOS, and Android with Xamarin.

Learn More

Try out Curse Voice, from Curse, here.

To get started developing with the Xamarin platform, check out our developer documentation, or get live online training with Xamarin University.

Working with Physically-Based Shading: a Practical Approach

Throughout the development of Unity 5, we’ve used our Viking Village project internally as a testing ground for shading and lighting workflows.

If you’re using the Unity 5 beta, you can download the Viking Village package from the Asset Store to get insights into how you can assemble and illuminate a scene in Unity 5. We also present some of our learnings below.

Creating a template environment

In order to ensure that your texturing and shader configuration behave appropriately, we recommend using a simple scene with a variety of lighting setups. This could mean differing skyboxes, lights, etc. – anything that contributes to illuminating your model.

When you open Unity 5, you’ll notice that any new empty scene has a procedural sky as well as default ambient and reflection settings. This provides a suitable starting point.


For our template environment we used:

  • HDR camera rendering

  • A few scattered reflection probes (for localized reflections on objects)

  • A group of light-probes

  • A set of HDR sky-textures and materials, as well as procedural skies. The sky which ships with this project was custom-made for Unity by Bob Groothuis, author of Dutch Skies 360.

  • Off-white directional lights with matched intensity and HDR sky color

Adjusting sky texture panoramas

Most sky textures include the sun (along with flares etc.), thus, light from the sun gets reflected by surfaces. This has the potential to cause three issues:

1) The Directional light you use to represent the sun must match the exact direction of the sun painted onto the skybox or there will be multiple specular hotspots on the material.

2) The reflected sun and the specular hotspot overlap, causing intense specular highlights.

3) The baked-in sun reflection is not occluded when the surface is in shadow and it becomes overly shiny in darkness.


The sun is erased from the sky texture and re-added using a directional light and a lens flare.

As a result, the sun highlight, flares, sunrays and HDR values need to be edited out of the sky texture and reapplied using Directional Lights.

Authoring physically-based shading materials

To avoid the guesswork involved in emulating real-world materials, it is useful to follow a reliable known reference. The Standard Shader supports both a Specular Color and a Metallic workflow. Both define the color of the reflections leaving the surface. In the Specular workflow, the color is specified directly, whilst in the Metallic workflow, the color is derived from a combination of the diffuse color and the metallic value set in the Standard Shader controls.

For the Viking Village project, we used the Standard Shader’s Specular Color Workflow. Our calibration scene, which you can download from the Asset Store, includes some handy calibration charts. We referenced the charts regularly when designing our materials.

When approaching materials you can choose between what we call the Specular and the Metallic workflows, each with its own set of values and a reference chart. In the Specular workflow you choose the color of the specularly reflected light directly; in the Metallic workflow you choose whether the material behaves like a metal when it is illuminated.

The specular value chart:


The metallic value chart:

Choosing between the Specular and Metallic workflows is largely a matter of personal preference; you can usually get the same result whichever workflow you choose to use.

Aside from charts and values, gathering samples of real-world surfaces is highly valuable. It helps greatly to find the surface type you are trying to imitate and to get an understanding of how it reacts to light.

Setting up the material

When starting out, it’s often useful to create a plain but tweakable representation of the materials using colors, values and sliders derived from the calibration charts. Then, you can apply textures while keeping the original material as a reference to confirm that characteristics are preserved.


Top row: untextured. Bottom row: textured. Left to right: Rock, Wood, Bone, Metal.

The traditional approach to creating textures

Textures in the Viking Village have been authored both using traditional manual methods (photos plus tweaking) and from scanned diffuse/albedo, specular, gloss and normal map images, which were provided to us by Quixel.

Be careful when adding detail in the texture channels of the material. For example, it usually pays to avoid placing lighting (Ambient Occlusion, shadows etc.) in your textures: remember that the physically based rendering approach provides all the lighting you should need.

Naturally, retouching photographs is more demanding than using scanned data, especially when it comes to PBS-friendly values. There are tools that make the process easier, such as Quixel Suite and Allegorithmic Substance Painter.

Scanned data

PBS-calibrated scanned textures alleviate the need for editing, since data is already separated into channels and contains values for albedo, specular and smoothness. It is best if the software that provides the PBS-calibrated data contains a Unity profile for export. You can always use the reference charts as a sanity check and as a guide if you need to calibrate the values using Photoshop or a related tool.

Material examples

The Viking Village scene features a large amount of content while trying to stay within a reasonable texture memory budget. Let’s take a look at how we set up a 10-meter-high wooden crane as an example.

Notice that many textures, especially specular and diffuse textures, are homogeneous and require different resolutions.

Example 1: The crane object has 2 materials: 2 diffuse, 1 specular-smoothness, 2 occlusion and 2 detail textures.

Example 2: The shield prop has 1 material: 1 diffuse, 1 specular-smoothness, 1 occlusion and no detail textures.

On the left: Crane Inspector for both materials. Rightmost: the shield prop material.

  • Albedo texture: In the specular workflow it represents the color of diffuse light bounced off the surface. It does not necessarily need to be highly detailed as seen in the left image (crane), whereas the right texture (shield) includes significant unique detail.

Left: Painted crane diffuse map snippet with plain wooden color and intensity, containing a modest amount of detail. Right: Shield diffuse texture with higher (ppi) unique detail.
Diffuse value (no texture) for the crane material

  • Specular: Non-metals (insulators) are comparatively dark and in grayscale, while metal values are bright and may be colored (remember that rust, oil and dirt on a metal are not metallic). The specular response of the wood surface did not benefit much from a texture, so a value was used instead of inputting a map.

Crane specular values for wood.

Crane specular map for metal (not using the metallic shader). Right: Shield specular texture.

  • Smoothness is a key element in PBS materials. It contributes variation, imperfections and detail to surfaces and helps represent their state and age.
    For the crane, smoothness happened to be fairly constant across the surface and was therefore substituted by a value. This delivered a reasonable texture memory gain.

Crane smoothness values for wood. No textures required!

Crane smoothness map for metal (not using the metallic shader). Right: Shield smoothness map with mixed metal and wood surfaces.

  • Occlusion indicates how exposed different points of the surface are to the light of the surrounding environment. Ambient Occlusion brings out surface detail and depth by muting ambient and reflection in areas with little indirect light.
    Keep in mind that there’s also the option of using SSAO (Screen Space Ambient Occlusion) in your scene. Using SSAO and AO could result in double darkening of certain areas, in which case you may want to consider treating the AO map as a cavity map.
    An AO map that would emphasise deep cracks and creases may be the best option if the game uses SSAO and/or lightmapped Ambient Occlusion.


1: Lightmapped AO, 2: Occlusion texture, 3: Occlusion in Diffuse, 4: Image effect SSAO

Secondary Textures and resolution

Secondary Textures can be used to increase the level of detail or provide variation within the material. They can be masked using the Detail Mask property.

Due to the low resolution primary diffuse wood texture in the Crane example, the secondary texture set is crucial. It adds the fine detail to the final surface. In this instance, the detail-maps are tiled and at a reasonably low resolution. They are repeated on many other wooden surfaces, thus delivering a major texture memory saving.

Secondary albedo and normal maps compensate for the low-resolution main diffuse and normal map. Both textures reduce overall texture memory by being widely overlaid and tiled on wooden surfaces throughout the village. Be cautious when adding lighting information to a diffuse detail map, as this has a similar effect to adding such information to the primary diffuse.

Crane wooden surface with (left) and without (right) secondary texture maps.

These workflows certainly helped us when designing the Viking Village project. We hope you also find them useful, and look forward to reading your comments!

Acknowledgements

The Viking Village project was launched in partnership with the creative team at Quixel, developer of HDR surface capture technology and the Quixel Megascans library of PBS-ready textures.

Big thanks to the very talented Emmanuel “Manu” Tavares and Plamen “Paco” Tamnev for bringing this scene to life.

Go and download the project at the Asset Store. Be aware that it’s optimised for Unity 5.0.0 RC2. Pre-order customers and subscribers can download this beta version of Unity here, for Mac and Windows.

February 17

StepCounter Gets in Shape

Last year we announced the release of My StepCounter for iOS. Since then, the iOS landscape has changed considerably – HealthKit was announced, two new iPhones have been introduced, and the iPad now supports the same API for CoreMotion as the iPhone.

The original app was designed purely with the iPhone 5s in mind, as this was the only supported hardware available at the time. With an increase in the number of devices that support the step counting API, I thought it was time to make an update to ensure the app works perfectly on these new devices.

While updating the app, I opted to migrate the user interface from Apple’s older Xib format to a single Storyboard. With this change, you can now visualize how My StepCounter will look from within both Xamarin Studio and Visual Studio. The new approach is great for developers using Visual Studio, as it minimizes the time spent interacting with Xcode on their Mac build host.


Not only does My StepCounter now support Storyboards, but a number of images have also been replaced with custom views drawn using code generated from PaintCode. This change cuts down the number of artwork assets the app needs to ship with, and thus reduces the final binary size.

The benefit is huge: the app looks great on any screen size without the binary size bloating from additional images. One of my favorite things about using a tool like PaintCode is that the control is live-rendered within the storyboard designer, so you can instantly see how your app will look without having to deploy to the simulator or a device.

A few extra little additions to the app include integration with Xamarin Insights, a new share option so you can tweet or post your step count, and improved animations across the entire app.

All of the code is up on GitHub for you to download and explore today.

PlasticDrive – dynamic readonly workspaces as windows drives

PlasticDrive is a tool to mount a changeset as a Windows drive and let you quickly browse the code using your favorite tools (Visual Studio, Eclipse, IntelliJ…). Files are downloaded from the server on demand (then cached), so the mount happens immediately; there’s no need to wait for a big update to finish.
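The download-on-demand behaviour described above can be sketched as a simple read-through cache. This is only an illustration of the idea: the function names, cache layout, and server call below are hypothetical, and PlasticDrive’s real implementation hooks into the Windows filesystem layer rather than a Python function:

```python
# Read-through cache sketch: a file is fetched from the server the first
# time it is read, then served from a local cache on later accesses.
# fetch_from_server and the cache layout are illustrative assumptions.
import os

CACHE_DIR = "plasticdrive_cache"

def fetch_from_server(path):
    # Stand-in for "download this file at the mounted changeset's revision".
    return ("contents of %s" % path).encode("utf-8")

def read_file(path):
    cached = os.path.join(CACHE_DIR, path.replace("/", "_"))
    if not os.path.exists(cached):          # first access: download and cache
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(cached, "wb") as f:
            f.write(fetch_from_server(path))
    with open(cached, "rb") as f:           # later accesses hit the cache
        return f.read()
```

Because only the files you actually open are fetched, the "mount" itself is effectively free, which is what makes browsing a large changeset feel instant.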

Monologue

Monologue is a window into the world, work, and lives of the community members and developers that make up the Mono Project, which is a free cross-platform development environment used primarily on Linux.

If you would rather follow Monologue using a newsreader, we provide the following feed:

RSS 2.0 Feed

Monologue is powered by Mono and the Monologue software.

Bloggers