Problems with map movement and map zoom

Include the following details (edit as applicable):

  • Issue category: Lightship Maps
  • Device type & OS version: Android / iOS
  • Host machine & OS version: Windows
  • Issue Environment: On Device
  • Xcode version: Not installed
  • ARDK version: Not installed
  • Unity version: 2022.3.5f1
  • Lightship Maps version: 0.4.0

Hello Lightship Maps developer team and Lightship community! :grinning:

I am developing a game with AR Foundation on Unity for Android and iOS.
I will probably migrate to ARDK in the future; for now I want to use Lightship Maps to show the player's position and other elements.

Description of the issue:
I have placed a minimap on the screen following these instructions:

“A second camera, called “Map Camera”, is used to render the map to a render texture which is then displayed to the user on a RawImage canvas element.”

As explained in the Render-to-Texture Sample.

But when I open the scene I have two problems: movement and zoom.

Movement problem
When I play the Top-Down Camera Sample, I can put my finger on a point on the map, represented as a red point in the image, and drag it across the screen. My finger always stays over the same point on the map, as shown in image 1 (start) and image 2 (finish).

But when I try to do the same on my map, there is a gap between the point on the map and my finger, as shown in image 3 (start) and image 4 (finish).

Zoom problem
When I zoom in the Top-Down Camera Sample with two fingers, represented as red points in the image, the zoom is applied to the center between the two points, as shown in image 5 (start) and image 6 (finish).

But when I try to do the same on my map, there is a gap between the point on the map originally aimed at and the result, as shown in image 7 (start) and image 8 (finish).
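For context, here is a minimal sketch of the kind of conversion I expect to need: mapping a screen-space touch over the minimap `RawImage` back into the Map Camera's view. The `_mapCamera` and `_rawImage` fields are placeholders for my own setup, not names from the sample.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: convert a touch over the minimap RawImage into a ray from the
// Map Camera, so drags on the RawImage can track points on the map.
public class MinimapTouch : MonoBehaviour
{
    [SerializeField] private Camera _mapCamera;   // the "Map Camera" rendering to the texture
    [SerializeField] private RawImage _rawImage;  // the canvas element showing the render texture

    // Returns a ray into the map for the given screen point,
    // or null if the point falls outside the RawImage.
    public Ray? ScreenPointToMapRay(Vector2 screenPoint)
    {
        RectTransform rect = _rawImage.rectTransform;
        if (!RectTransformUtility.ScreenPointToLocalPointInRectangle(
                rect, screenPoint, null, out Vector2 local)) // null camera: Screen Space - Overlay canvas
            return null;

        // Convert local rect coordinates into normalized (0..1) viewport coordinates.
        Vector2 viewport = Rect.PointToNormalized(rect.rect, local);
        if (viewport.x < 0f || viewport.x > 1f || viewport.y < 0f || viewport.y > 1f)
            return null;

        return _mapCamera.ViewportPointToRay(viewport);
    }
}
```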


  1. How can I fix the “Movement problem”?
  2. How can I fix the “Zoom problem”?

Thank you for your time.

Hi Antonio,

Both of these issues could be something as simple as having the UI set up differently in your project compared to the sample. With that being said, I would have to see how your project is set up to help you further. If possible, could you create a minimal project showcasing the issue or private message me your project?

Kind regards,
Maverick L.

Hi Maverick,

Thanks to your comment about the UI, I found a solution for the “Movement problem”.

I sent you the project via private message.
All you have to do is enter your Lightship Maps API key and activate the LightshipMap element.

Kind regards,

Hi Antonio,

I’m glad to hear that you were able to get the issue resolved! For the “Canvas Scaler” option, setting it to “Constant Pixel Size” means that every UI element under your canvas keeps its original pixel size. This is fine if you’re only going to use your app on one particular device, but in most cases you’ll have to support displays of various physical sizes and resolutions. To do this, you should make use of anchors, “Scale With Screen Size,” or roll your own scaling solution.
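To make the two scaler modes concrete, here is a small sketch configuring the Canvas Scaler from code, equivalent to choosing the options in the Inspector. The reference resolution shown is just an example value.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: configure the Canvas Scaler from code, mirroring the
// Inspector settings discussed above.
public class ScalerSetup : MonoBehaviour
{
    void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();

        // Option A: every UI element keeps its pixel size on every display.
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ConstantPixelSize;

        // Option B: scale the whole canvas relative to a reference resolution,
        // so the UI adapts to displays of different sizes.
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1920, 1080);
        scaler.matchWidthOrHeight = 0.5f; // blend between matching width and height
    }
}
```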

Perhaps you can change the simulated device you use when running your app, to test how your UI behaves on different displays?

Hi Maverick,

I understand your explanation about “Constant Pixel Size”.
I will test on different devices in the simulator as you suggest, thanks.
About your proposal:

To do this, you should make use of anchors, “Scale With Screen Size,” or roll your own scaling solution.

Can you explain it in more detail, please?

I forgot to mention that the “Zoom problem” still happens; I have not been able to correct it.
For this reason I have sent you the project for your analysis.
Thanks for your help Maverick.


Anchors, found under the “Rect Transform” section of a UI element in the Inspector, use relative screen-space coordinates (between 0 and 1) to tell Unity how to position, orient, and scale UI elements on the screen. You are given a minimum anchor (“Min Anchor”), the bottom left, and a maximum anchor (“Max Anchor”), the top right. If the min and max anchors are the same, we call that a single point anchor. UI elements with single point anchors render at an absolute size, positioned at the point the anchors mark.

For instance, let’s say you have a dialog box that you would like to display at the center of the screen. You select the “Constant Pixel Size” setting for Canvas Scaling and set the minimum and maximum anchors to (0.5, 0.5). The width and height you select dictate the size of the dialog box in pixels, no matter what screen your dialog box is being shown on; this is called absolute or constant scaling.
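The centered-dialog example above can be sketched in code; the size values here are arbitrary placeholders:

```csharp
using UnityEngine;

// Sketch: a dialog box centered on screen using a single point anchor.
public static class DialogAnchors
{
    public static void CenterAtFixedSize(RectTransform dialog)
    {
        // Min and max anchors coincide at the screen center: a single point anchor.
        dialog.anchorMin = new Vector2(0.5f, 0.5f);
        dialog.anchorMax = new Vector2(0.5f, 0.5f);
        dialog.pivot = new Vector2(0.5f, 0.5f);

        // With a single point anchor, sizeDelta is the absolute size in pixels.
        dialog.sizeDelta = new Vector2(400f, 200f);
        dialog.anchoredPosition = Vector2.zero; // no offset from the anchor point
    }
}
```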

What about if anchors differ? Anchors that aren’t the same tell Unity to perform scaling in the direction of the anchors: maximum - minimum. The bottom left of the UI element’s rectangle has to match the minimum anchor and the top right of the rectangle must match the maximum anchor. If it helps you, you can think about the relative coordinates as a percentage starting from the viewport origin – the bottom left in Unity’s case. A relative min x-coordinate of 0.1 could be understood as saying you’d like 10% of the screen space to the left of the UI element.
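To illustrate differing anchors, here is a sketch that stretches an element across a fixed fraction of its parent; the quarter-screen choice is just an example:

```csharp
using UnityEngine;

// Sketch: differing anchors stretch the element with the screen.
public static class StretchAnchors
{
    public static void FillBottomLeftQuarter(RectTransform element)
    {
        // Anchors spanning 0..0.5 on both axes: the element occupies the
        // bottom-left 50% x 50% of its parent at any display resolution.
        element.anchorMin = new Vector2(0f, 0f);
        element.anchorMax = new Vector2(0.5f, 0.5f);

        // With stretched anchors, offsetMin/offsetMax are margins from the
        // anchor rectangle; zero means the element matches it exactly.
        element.offsetMin = Vector2.zero;
        element.offsetMax = Vector2.zero;
    }
}
```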

When it comes to your zoom problem, I received your project but have not yet had a chance to investigate the cause.

Hi Maverick,

Have you had time to look into the “Zoom problem”?